MondayMap: Grey or Gray Matter?


The dumbing down of the United States continues apace. While the 9- to 15-year-olds competing in this year’s national spelling bee seem to have no problem with words like “syzygy”, “onomatopoeia” and “triskaidekaphobia”, the general adult population is in dire linguistic straits.

An analysis of online search queries by Vocativ and Google Trends highlights some rather disturbing misspellings of rather common words. What makes the survey so fascinating, though, is seeing the variations mapped by state. Sadly, around a dozen states had the most trouble with the word “grey”, including California, Illinois, Kansas, Michigan and North Dakota. Arkansas, on the other hand, should be proud (or not) that its residents have the most trouble with the word “diarrhea” (US spelling), while residents of Idaho can’t seem to spell “antelope”. Texans get hung up on “beautiful” and those living in Wyoming can’t seem to spell “jelous” [sic].

Read more from Vocativ here.

Infographic courtesy of Vocativ / Google Trends.


Regression Texting


Some culture watchers believe we are entering a linguistic death spiral. Our increasingly tech-driven communication is enabling our language to evolve in unforeseen ways, and some linguists believe the evolution is actually taking us backwards rather than forwards. Enter exhibit one into the record: the 👿 emoji.

From the Guardian:

So it’s official. We are evolving backwards. Emoji, the visual system of communication that is incredibly popular online, is Britain’s fastest-growing language according to Professor Vyv Evans, a linguist at Bangor University.

The comparison he uses is telling – but not in the way the prof, who appears enthusiastic about emojis, presumably intends. “As a visual language emoji has already far eclipsed hieroglyphics, its ancient Egyptian precursor which took centuries to develop,” says Evans.

Perhaps that is because it is easier to go downhill than uphill. After millennia of painful improvement, from illiteracy to Shakespeare and beyond, humanity is rushing to throw it all away. We’re heading back to ancient Egyptian times, next stop the stone age, with a big yellow smiley grin on our faces.

Unicode, the company that created emojis, has announced it will release 36 more of the brainless little icons next year. Demand is massive: 72% of 18- to 25-year-olds find it easier to express their feelings in emoji pictures than through the written word, according to a survey for Talk Talk mobile.

As tends to happen in an age when technology is transforming culture on a daily basis, people relate such news with bland irony or apparent joy. Who wants to be the crusty old conservative who questions progress? But the simplest and most common-sense historical and anthropological evidence tells us that Emoji is not “progress” by any definition. It is plainly a step back.

Evans compares Emoji with ancient Egyptian hieroglyphics. Well indeed. Ancient Egypt was a remarkable civilisation, but it had some drawbacks. The Egyptians created a magnificent but static culture. They invented a superb artistic style and powerful mythology – then stuck with these for millennia. Hieroglyphs enabled them to write spells but not to develop a more flexible, questioning literary culture: they left that to the Greeks.

These jumped-up Aegean loudmouths, using an abstract non-pictorial alphabet they got from the Phoenicians, obviously and spectacularly outdid the Egyptians in their range of expression. The Greek alphabet was much more productive than all those lovely Egyptian pictures. That is why there is no ancient Egyptian Iliad or Odyssey.

In other words, there are harsh limits on what you can say with pictures. The written word is infinitely more adaptable. That’s why Greece rather than Egypt leapt forward and why Shakespeare was more articulate than the Aztecs.

Read the entire article here.

Image: A subset of new emojis proposed for adoption in 2016. The third emoji along the top row is, of course, “selfie”. Courtesy of Unicode.


Active SETI


More than half a century after the SETI (Search for Extra-Terrestrial Intelligence) experiment began, some astronomers are thinking of SETI 2.0, or active SETI. Rather than just passively listening for alien-made signals emanating from distant exoplanets, these astronomers wish to take the work a bold step further. They’re planning to transmit messages in the hope that someone or something will be listening. And that has opponents of the plan rather worried. If something out there does hear us, will it come looking, and if so, then what? Will the process result in a real-life The Day the Earth Stood Still or Alien? And, more importantly, will the aliens all look astonishingly Hollywood-like?

From BBC:

Scientists at a US conference have said it is time to try actively to contact intelligent life on other worlds.

Researchers involved in the search for extra-terrestrial life are considering what the message from Earth should be.

The call was made by the Search for Extra Terrestrial Intelligence institute at a meeting of the American Association for the Advancement of Science in San Jose.

But others argued that making our presence known might be dangerous.

Researchers at the Seti institute have been listening for signals from outer space for more than 30 years using radio telescope facilities in the US. So far there has been no sign of ET.

The organisation’s director, Dr Seth Shostak, told attendees to the AAAS meeting that it was now time to step up the search.

“Some of us at the institute are interested in ‘active Seti’, not just listening but broadcasting something to some nearby stars because maybe there is some chance that if you wake somebody up you’ll get a response,” he told BBC News.

The concerns are obvious, but sitting in his office at the institute in Mountain View, California, in the heart of Silicon Valley, he expresses them with characteristic, impish glee.

Game over?

“A lot of people are against active Seti because it is dangerous. It is like shouting in the jungle. You don’t know what is out there; you better not do it. If you incite the aliens to obliterate the planet, you wouldn’t want that on your tombstone, right?”

I couldn’t argue with that. But initially, I could scarcely believe I was having this conversation at a serious research institute rather than at a science fiction convention. The sci-fi feel of our talk was underlined by the toy figures of bug-eyed aliens that cheerfully decorate the office.

But Dr Shostak is a credible and popular figure and has been invited to present his arguments.

Leading astronomers, anthropologists and social scientists will gather at his institute after the AAAS meeting for a symposium to flesh out plans for a proposal for active Seti to put to the public and politicians.

High on the agenda is whether such a move would, as he put it so starkly, lead to the “obliteration” of the planet.

“I don’t see why the aliens would have any incentive to do that,” Dr Shostak tells me.

“Beyond that, we have been telling them willy-nilly that we are here for 70 years now. They are not very interesting messages but the early TV broadcasts, the early radio, the radar from the Second World War – all that has leaked off the Earth.

“Any society that could come here and ruin our whole day by incinerating the planet already knows we are here.”

Read the entire article here.

Image courtesy of Google Search.

Myth Busting Silicon(e) Valley


Question: what do silicone implants and Silicon Valley have in common? Answer: they are both instruments of a grandiose illusion. The first, on a mostly personal level, promises eternal youth and vigor; the second, on a much grander scale, promises eternal wealth and greatness for humanity.

So, let’s leave aside the human cosmetic question for another time and concentrate on the broad deception that is current Silicon Valley. It’s a deception at many different levels — self-deception of Silicon Valley’s young geeks and code jockeys, and the wider delusion that promises us all a glittering future underwritten by rapturous tech.

And who better to debunk the myths that envelop the Valley like San Francisco’s fog than Sam Biddle, former editor of Valleywag? He offers a scathing critique, which happens to be spot on. Quite rightly he asks whether we need yet another urban, on-demand laundry app, and what on earth is the value to society of “Yo”? But more importantly, he asks us to reconsider our misplaced awe and to knock Silicon Valley from its perch of self-fulfilling self-satisfaction. Yo and Facebook and Uber and Clinkle and Ringly and DogVacay and WhatsApp and the thousands of other trivial start-ups — despite astronomical valuations — will not be humanity’s savior. We need better ideas and deeper answers.

From GQ:

I think my life is better because of my iPhone. Yours probably is, too. I’m grateful to live in a time when I can see my baby cousins or experience any album ever without getting out of bed. I’m grateful that I will literally never be lost again, so long as my phone has battery. And I’m grateful that there are so many people so much smarter than I am who devise things like this, which are magical for the first week they show up, then a given in my life a week later.

We live in an era of technical ability that would have nauseated our ancestors with wonder, and so much of it comes from one very small place in California. But all these unimpeachable humanoid upgrades—the smartphones, the Google-gifted knowledge—are increasingly the exception, rather than the rule, of Silicon Valley’s output. What was once a land of upstarts and rebels is now being led by the money-hungry and the unspirited. Which is why we have a start-up that mails your dog curated treats and an app that says “Yo.” The brightest minds in tech just lately seem more concerned with silly business ideas and innocuous “disruption,” all for the shot at an immense payday. And when our country’s smartest people are working on the dumbest things, we all lose out.

That gap between the Silicon Valley that enriches the world and the Silicon Valley that wastes itself on the trivial is widening daily. And one of the biggest contributing factors is that the Valley has lost touch with reality by subscribing to its own self-congratulatory mythmaking. That these beliefs are mostly baseless, or at least egotistically distorted, is a problem—not just for Silicon Valley but for the rest of us. Which is why we’re here to help the Valley tear down its own myths—these seven in particular.

Myth #1: Silicon Valley Is the Universe’s Only True Meritocracy

 Everyone in Silicon Valley has convinced himself he’s helped create a free-market paradise, the software successor to Jefferson’s brotherhood of noble yeomen. “Silicon Valley has this way of finding greatness and supporting it,” said a member of Greylock Partners, a major venture-capital firm with over $2 billion under management. “It values meritocracy more than anyplace else.” After complaints of the start-up economy’s profound whiteness reached mainstream discussion just last year, companies like Apple, Facebook, and Twitter reluctantly released internal diversity reports. The results were as homogenized as expected: At Twitter, 79 percent of the leadership is male and 72 percent of it is white. At Facebook, senior positions are 77 percent male and 74 percent white. Twitter—a company whose early success can be directly attributed to the pioneering downloads of black smartphone users—hosts an entirely white board of directors. It’s a pounding indictment of Silicon Valley’s corporate psyche that Mark Zuckerberg—a bourgeois white kid from suburban New York who attended Harvard—is considered the Horatio Alger 2.0 paragon. When Paul Graham, the then head of the massive start-up incubator Y Combinator, told The New York Times that he could “be tricked by anyone who looks like Mark Zuckerberg,” he wasn’t just talking about Zuck’s youth.

If there’s any reassuring news, it’s not that tech’s diversity crisis is getting better, but that in the face of so much dismal news, people are becoming angry enough and brave enough to admit that the state of things is not good. Silicon Valley loves data, after all, and with data readily demonstrating tech’s overwhelming white-guy problem, even the true believers in meritocracy see the circumstances as they actually are.

Earlier this year, Ellen Pao became the most mentioned name in Silicon Valley as her gender-discrimination suit against her former employer, Kleiner Perkins Caufield & Byers, played out in court. Although the jury sided with the legendary VC firm, the Pao case was a watershed moment, bringing sunlight and national scrutiny to the issue of unchecked Valley sexism. For every defeated Ellen Pao, we can hope there are a hundred other female technology workers who feel new courage to speak up against wrongdoing, and a thousand male co-workers and employers who’ll reconsider their boys’-club bullshit. But they’ve got their work cut out for them.

Myth #4: School Is for Suckers, Just Drop Out

 Every year PayPal co-founder, investor-guru, and rabid libertarian Peter Thiel awards a small group of college-age students the Thiel Fellowship, a paid offer to either drop out or forgo college entirely. In exchange, the students receive money, mentorship, and networking opportunities from Thiel as they pursue a start-up of their choice. We’re frequently reminded of the tech titans of industry who never got a degree—Steve Jobs, Bill Gates, and Mark Zuckerberg are the most cited, though the fact that they’re statistical exceptions is an aside at best. To be young in Silicon Valley is great; to be a young dropout is golden.

The virtuous dropout hasn’t just made college seem optional for many aspiring barons—formal education is now excoriated in Silicon Valley as an obsolete system dreamed up by people who’d never heard of photo filters or Snapchat. Mix this cynicism with the libertarian streak many tech entrepreneurs carry already and you’ve got yourself a legit anti-education movement.

And for what? There’s no evidence that avoiding a conventional education today grants business success tomorrow. The gifted few who end up dropping out and changing tech history would have probably changed tech history anyway—you can’t learn start-up greatness by refusing to learn in a college classroom. And given that most start-ups fail, do we want an appreciable segment of bright young people gambling so heavily on being the next Zuck? More important, do we want an economy of CEOs who never had to learn to get along with their dorm-mates? Who never had the opportunity to grow up and figure out how to be a human being functioning in society? Who went straight from a bedroom in their parents’ house to an incubator that paid for their meals? It’s no wonder tech has an antisocial rep.

Myth #7: Silicon Valley Is Saving the World

Two years ago an online list of “57 start-up lessons” made its way through the coder community, bolstered by a co-sign from Paul Graham. “Wow, is this list good,” he commented. “It has the kind of resonance you only get when you’re writing from a lot of hard experience.” Among the platitudinous menagerie was this gem: “If it doesn’t augment the human condition for a huge number of people in a meaningful way, it’s not worth doing.” In a mission statement published on Andreessen Horowitz’s website, Marc Andreessen claimed he was “looking for the companies who are going to be the big winners because they are going to cause a fundamental change in the world.” The firm’s portfolio includes Ringly (maker of rings that light up when your phone does something), Teespring (custom T-shirts), DogVacay (pet-sitters on demand), and Hem (the zombified corpse of the furniture store Fab.com). Last year, wealthy Facebook alum Justin Rosenstein told a packed audience at TechCrunch Disrupt, “We in this room, we in technology, have a greater capacity to change the world than the kings and presidents of even a hundred years ago.” No one laughed, even though Rosenstein’s company, Asana, sells instant-messaging software.

 This isn’t just a matter of preening guys in fleece vests building giant companies predicated on their own personal annoyances. It’s wasteful and genuinely harmful to have so many people working on such trivial projects (Clinkle and fucking Yo) under the auspices of world-historical greatness. At one point recently, there were four separate on-demand laundry services operating in San Francisco, each no doubt staffed by smart young people who thought they were carving out a place of small software greatness. And yet for every laundry app, there are smart people doing smart, valuable things: Among the most recent batch of Y Combinator start-ups featured during March’s “Demo Day” were Diassess (twenty-minute HIV tests), Standard Cyborg (3D-printed limbs), and Atomwise (using supercomputing to develop new medical compounds). Those start-ups just happen to be sharing desk space at the incubator with “world changers” like Lumi (easy logo printing) and Underground Cellar (“curated, limited-edition wines with a twist”).

Read the entire article here.

Map: Silicon Valley, CA. Courtesy of Google.


Innovating the Disruption Or Disrupting the Innovation

Corporate America has a wonderful knack for embracing a meaningful idea and then overusing it to such an extent that it becomes thoroughly worthless. Until recently, every advertiser, every manufacturer, every service shamelessly promoted itself as an innovator. Everything a company did was driven by innovation: employees succeeded by innovating; the CEO was innovation incarnate; products were innovative; new processes drove innovation — in fact, the processes themselves were innovative. Any business worth its salt produced completely innovative stuff, from cupcakes to tires, from hair color to drill bits, from paper towels to hoses. And consequently this overwhelming ocean of innovation — which upon closer inspection isn’t real innovation at all — becomes worthless, underwhelming drivel.

So, what next for corporate America? Well, latch on to the next meme of course — disruption. Yawn.

From NPR/TED:

HBO’s Silicon Valley is back, with its pitch-perfect renderings of the culture and language of the tech world — like at the opening of the “Disrupt” startup competition run by the Tech Crunch website at the end of last season. “We’re making the world a better place through scalable fault-tolerant distributed databases” — the show’s writers didn’t have to exercise their imagination much to come up with those little arias of geeky self-puffery, or with the name Disrupt, which, as it happens, is what the Tech Crunch conferences are actually called. As is most everything else these days. “Disrupt” and “disruptive” are ubiquitous in the names of conferences, websites, business school degree programs and business book best-sellers. The words pop up in more than 500 TED Talks: “How to Avoid Disruption in Business and in Life,” “Embracing Disruption,” “Disrupting Higher Education,” “Disrupt Yourself.” It transcends being a mere buzzword. As the philosopher Jeremy Bentham said two centuries ago, there is a point where jargon becomes a species of the sublime.

 To give “disruptive” its due, it actually started its life with some meat on its bones. It was popularized in a 1997 book by Clayton Christensen of the Harvard Business School. According to Christensen, the reason why established companies fail isn’t that they don’t keep up with new technologies, but that their business models are disrupted by scrappy, bottom-fishing startups that turn out stripped-down versions of existing products at prices the established companies can’t afford to match. That’s what created an entry point for “disruptive innovations” like the Model T Ford, Craigslist classifieds, Skype and no-frills airlines.

Christensen makes a nice point. Sometimes you can get the world to beat a path to your door by building a crappier mousetrap, too, if you price it right. Some scholars have raised questions about that theory, but it isn’t the details of the story that have put “disruptive” on everybody’s lips; it’s the word itself. Buzzwords feed off their emotional resonances, not their ideas. And for pure resonance, “disruptive” is hard to beat. It’s a word with deep roots. I suspect I first encountered it when my parents read me the note that the teacher pinned to my sweater when I was sent home from kindergarten. Or maybe it reminds you of the unruly kid who was always pushing over the juice table. One way or another, the word evokes obstreperous rowdies, the impatient people who are always breaking stuff. It says something that “disrupt” is from the Latin for “shatter.”

Disrupt or be disrupted. The consultants and business book writers have proclaimed that as the chronic condition of the age, and everybody is clambering to be classed among the disruptors rather than the disruptees. The lists of disruptive companies in the business media include not just Amazon and Uber but also Procter and Gamble and General Motors. What company nowadays wouldn’t claim to be making waves? It’s the same with that phrase “disruptive technologies.” That might be robotics or next-generation genomics, sure. But CNBC also touts the disruptive potential of an iPhone case that converts to a video game joystick.

These days, people just use “disruptive” to mean shaking things up, though unlike my kindergarten teacher, they always infuse a note of approval. As those Tech Crunch competitors assured us, disruption makes the world a better place. Taco Bell has created a position called “Resident Disruptor,” and not to be outdone, McDonald’s is running radio ads describing its milkshake blenders as a disruptive technology. Well, OK, blenders really do shake things up. But by the time a tech buzzword has been embraced by the fast food chains, it’s getting a little frayed at the edges. “Disruption” was never really a new idea in the first place, just a new name for a fact of life as old as capitalism. Seventy years ago the economist Joseph Schumpeter was calling it the “gales of creative destruction,” and he just took the idea from Karl Marx.

Read the entire story here.

The Free Market? Yeah Right

The US purports to be home of the free market. But we all know it’s not. Rather, it is home to vested and entrenched interests who will fight tooth-and-nail to maintain the status quo beneath the veil of self-written regulations and laws. This is called protectionism — manufacturers, media companies, airlines and suppliers all do it. Texas car dealers and their corporate lobbyists are masters at this game.

From ars technica:

In a turn of events that isn’t terribly surprising, a bill to allow Tesla Motors to sell cars directly to consumers in Texas has failed to make it to the floor, with various state representatives offering excuses about not wanting to “piss off all the auto dealers.”

The Lone Star State’s notoriously anti-Tesla stance—one of the strongest in the nation—is in many ways the direct legacy of powerful lawmaker-turned-lobbyist Gene Fondren, who spent much of his life ensuring that the Texas Automobile Dealers Association’s wishes were railroaded through the Texas legislature.

That legacy is alive and well, with Texas lawmakers refusing to pass bills in 2013 and again in 2015 to allow Tesla to sell to consumers. Per the state’s franchise laws, auto manufacturers like Tesla are only allowed to sell cars to independent third-party dealers. These laws were originally intended to protect consumers against the possibility of automakers colluding on pricing; today, though, they function as protectionist shields for the entrenched political interests of car dealers and their powerful state- and nationwide lobbyist organizations.

The anti-Tesla sentiment didn’t stop Texas from attempting to snag the contracts for Tesla Motors’ upcoming “Gigafactory,” the multibillion dollar battery factory that Tesla Motors CEO Elon Musk eventually chose to build in Reno, Nevada.

Speaking of Elon Musk—in a stunning display of total ignorance, Texas state representative Senfronia Thompson (a Democrat representing House District 141) had this to say about the bill’s failure: “I can appreciate Tesla wanting to sell cars, but I think it would have been wiser if Mr. Tesla had sat down with the car dealers first.”

 Apparently being even minimally familiar with the matters one legislates isn’t a requirement to serve in the Texas legislature. However, Thompson did receive many thousands in campaign contributions from the Texas Automobile Dealers Association, so perhaps she’s just doing what she’s told.

Read the entire story here.

Death Explained


Let’s leave the mysteries of the spiritual afterlife aside for our various religions to fight over, and concentrate on what really happens after death. It may not please many aesthetes, but the cyclic process is beautiful nonetheless.

From Raw Story:

“It might take a little bit of force to break this up,” says mortician Holly Williams, lifting John’s arm and gently bending it at the fingers, elbow and wrist. “Usually, the fresher a body is, the easier it is for me to work on.”

Williams speaks softly and has a happy-go-lucky demeanour that belies the nature of her work. Raised and now employed at a family-run funeral home in north Texas, she has seen and handled dead bodies on an almost daily basis since childhood. Now 28 years old, she estimates that she has worked on something like 1,000 bodies.

Her work involves collecting recently deceased bodies from the Dallas–Fort Worth area and preparing them for their funeral.

“Most of the people we pick up die in nursing homes,” says Williams, “but sometimes we get people who died of gunshot wounds or in a car wreck. We might get a call to pick up someone who died alone and wasn’t found for days or weeks, and they’ll already be decomposing, which makes my work much harder.”

John had been dead about four hours before his body was brought into the funeral home. He had been relatively healthy for most of his life. He had worked his whole life on the Texas oil fields, a job that kept him physically active and in pretty good shape. He had stopped smoking decades earlier and drank alcohol moderately. Then, one cold January morning, he suffered a massive heart attack at home (apparently triggered by other, unknown, complications), fell to the floor, and died almost immediately. He was just 57 years old.

Now, John lay on Williams’ metal table, his body wrapped in a white linen sheet, cold and stiff to the touch, his skin purplish-grey – telltale signs that the early stages of decomposition were well under way.

Self-digestion

Far from being ‘dead’, a rotting corpse is teeming with life. A growing number of scientists view a rotting corpse as the cornerstone of a vast and complex ecosystem, which emerges soon after death and flourishes and evolves as decomposition proceeds.

Decomposition begins several minutes after death with a process called autolysis, or self-digestion. Soon after the heart stops beating, cells become deprived of oxygen, and their acidity increases as the toxic by-products of chemical reactions begin to accumulate inside them. Enzymes start to digest cell membranes and then leak out as the cells break down. This usually begins in the liver, which is rich in enzymes, and in the brain, which has a high water content. Eventually, though, all other tissues and organs begin to break down in this way. Damaged blood cells begin to spill out of broken vessels and, aided by gravity, settle in the capillaries and small veins, discolouring the skin.

Body temperature also begins to drop, until it has acclimatised to its surroundings. Then, rigor mortis – “the stiffness of death” – sets in, starting in the eyelids, jaw and neck muscles, before working its way into the trunk and then the limbs. In life, muscle cells contract and relax due to the actions of two filamentous proteins (actin and myosin), which slide along each other. After death, the cells are depleted of their energy source and the protein filaments become locked in place. This causes the muscles to become rigid and locks the joints.

During these early stages, the cadaveric ecosystem consists mostly of the bacteria that live in and on the living human body. Our bodies host huge numbers of bacteria; every one of the body’s surfaces and corners provides a habitat for a specialised microbial community. By far the largest of these communities resides in the gut, which is home to trillions of bacteria of hundreds or perhaps thousands of different species.

The gut microbiome is one of the hottest research topics in biology; it’s been linked to roles in human health and a plethora of conditions and diseases, from autism and depression to irritable bowel syndrome and obesity. But we still know little about these microbial passengers. We know even less about what happens to them when we die.

Putrefaction

Scattered among the pine trees in Huntsville, Texas, lie around half a dozen human cadavers in various stages of decay. The two most recently placed bodies are spread-eagled near the centre of the small enclosure with much of their loose, grey-blue mottled skin still intact, their ribcages and pelvic bones visible between slowly putrefying flesh. A few metres away lies another, fully skeletonised, with its black, hardened skin clinging to the bones, as if it were wearing a shiny latex suit and skullcap. Further still, beyond other skeletal remains scattered by vultures, lies a third body within a wood and wire cage. It is nearing the end of the death cycle, partly mummified. Several large, brown mushrooms grow from where an abdomen once was.

For most of us the sight of a rotting corpse is at best unsettling and at worst repulsive and frightening, the stuff of nightmares. But this is everyday work for the folks at the Southeast Texas Applied Forensic Science Facility. Opened in 2009, the facility is located within a 247-acre area of National Forest owned by Sam Houston State University (SHSU). Within it, a nine-acre plot of densely wooded land has been sealed off from the wider area and further subdivided, by 10-foot-high green wire fences topped with barbed wire.

In late 2011, SHSU researchers Sibyl Bucheli and Aaron Lynne and their colleagues placed two fresh cadavers here, and left them to decay under natural conditions.

Once self-digestion is under way and bacteria have started to escape from the gastrointestinal tract, putrefaction begins. This is molecular death – the breakdown of soft tissues even further, into gases, liquids and salts. It is already under way at the earlier stages of decomposition but really gets going when anaerobic bacteria get in on the act.

Putrefaction is associated with a marked shift from aerobic bacterial species, which require oxygen to grow, to anaerobic ones, which do not. These then feed on the body’s tissues, fermenting the sugars in them to produce gaseous by-products such as methane, hydrogen sulphide and ammonia, which accumulate within the body, inflating (or ‘bloating’) the abdomen and sometimes other body parts.

This causes further discolouration of the body. As damaged blood cells continue to leak from disintegrating vessels, anaerobic bacteria convert haemoglobin molecules, which once carried oxygen around the body, into sulfhaemoglobin. The presence of this molecule in settled blood gives skin the marbled, greenish-black appearance characteristic of a body undergoing active decomposition.

Colonisation

When a decomposing body starts to purge, it becomes fully exposed to its surroundings. At this stage, the cadaveric ecosystem really comes into its own: a ‘hub’ for microbes, insects and scavengers.

Two species closely linked with decomposition are blowflies and flesh flies (and their larvae). Cadavers give off a foul, sickly-sweet odour, made up of a complex cocktail of volatile compounds that changes as decomposition progresses. Blowflies detect the smell using specialised receptors on their antennae, then land on the cadaver and lay their eggs in orifices and open wounds.

Each fly deposits around 250 eggs that hatch within 24 hours, giving rise to small first-stage maggots. These feed on the rotting flesh and then moult into larger maggots, which feed for several hours before moulting again. After feeding some more, these yet larger, and now fattened, maggots wriggle away from the body. They then pupate and transform into adult flies, and the cycle repeats until there’s nothing left for them to feed on.

Under the right conditions, an actively decaying body will have large numbers of stage-three maggots feeding on it. This ‘maggot mass’ generates a lot of heat, raising the inside temperature by more than 10°C. Like penguins huddling in the South Pole, individual maggots within the mass are constantly on the move. But whereas penguins huddle to keep warm, maggots in the mass move around to stay cool.

“It’s a double-edged sword,” Bucheli explains, surrounded by large toy insects and a collection of Monster High dolls in her SHSU office. “If you’re always at the edge, you might get eaten by a bird, and if you’re always in the centre, you might get cooked. So they’re constantly moving from the centre to the edges and back.”

Purging

“We’re looking at the purging fluid that comes out of decomposing bodies,” says Daniel Wescott, director of the Forensic Anthropology Center at Texas State University in San Marcos.

Wescott, an anthropologist specialising in skull structure, is using a micro-CT scanner to analyse the microscopic structure of the bones brought back from the body farm. He also collaborates with entomologists and microbiologists – including Javan, who has been busy analysing samples of cadaver soil collected from the San Marcos facility – as well as computer engineers and a pilot, who operate a drone that takes aerial photographs of the facility.

“I was reading an article about drones flying over crop fields, looking at which ones would be best to plant in,” he says. “They were looking at near-infrared, and organically rich soils were a darker colour than the others. I thought if they can do that, then maybe we can pick up these little circles.”

Those “little circles” are cadaver decomposition islands. A decomposing body significantly alters the chemistry of the soil beneath it, causing changes that may persist for years. Purging – the seeping of broken-down materials out of what’s left of the body – releases nutrients into the underlying soil, and maggot migration transfers much of the energy in a body to the wider environment. Eventually, the whole process creates a ‘cadaver decomposition island’, a highly concentrated area of organically rich soil. As well as releasing nutrients into the wider ecosystem, this attracts other organic materials, such as dead insects and faecal matter from larger animals.

According to one estimate, an average human body consists of 50–75 per cent water, and every kilogram of dry body mass eventually releases 32 g of nitrogen, 10 g of phosphorus, 4 g of potassium and 1 g of magnesium into the soil. Initially, it kills off some of the underlying and surrounding vegetation, possibly because of nitrogen toxicity or because of antibiotics found in the body, which are secreted by insect larvae as they feed on the flesh. Ultimately, though, decomposition is beneficial for the surrounding ecosystem.

According to the laws of thermodynamics, energy cannot be created or destroyed, only converted from one form to another. In other words: things fall apart, converting their mass to energy while doing so. Decomposition is one final, morbid reminder that all matter in the universe must follow these fundamental laws. It breaks us down, equilibrating our bodily matter with its surroundings, and recycling it so that other living things can put it to use.

Ashes to ashes, dust to dust.

Read the entire article here.
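Those per-kilogram figures invite a quick back-of-the-envelope check. Below is a minimal Python sketch of the arithmetic; the 70 kg body mass is my assumption for illustration, not a number from the article.

```python
# Back-of-the-envelope estimate of the nutrients a decomposing body
# returns to the soil, using the per-kilogram figures quoted above.
# ASSUMPTION: a 70 kg body, chosen purely for illustration.

BODY_MASS_KG = 70.0
WATER_FRACTIONS = (0.50, 0.75)  # body is 50-75 per cent water

# Grams released per kilogram of *dry* body mass (from the article)
NUTRIENTS_G_PER_DRY_KG = {
    "nitrogen": 32.0,
    "phosphorus": 10.0,
    "potassium": 4.0,
    "magnesium": 1.0,
}

for water in WATER_FRACTIONS:
    dry_mass_kg = BODY_MASS_KG * (1.0 - water)
    print(f"{water:.0%} water -> {dry_mass_kg:.1f} kg dry mass")
    for nutrient, g_per_kg in NUTRIENTS_G_PER_DRY_KG.items():
        print(f"  {nutrient}: {dry_mass_kg * g_per_kg:.0f} g")
```

For a 70 kg body that works out to roughly 17.5–35 kg of dry mass and somewhere between half a kilogram and a full kilogram of nitrogen alone, which gives a sense of scale for the “cadaver decomposition islands” described above.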

Image: Still-Life with a Skull, 17th-century painting by Philippe de Champaigne. Public Domain.

Texas Needs More Guns, Not Less

Video: https://www.youtube.com/watch?v=Cgvq0iOrZ_k

Nine dead. Waco, Texas. May 17, 2015. Gunfight at Twin Peaks restaurant.

What this should tell us, particularly gun control advocates, is that Texans need more guns. After all, the US typically loosens gun restrictions after major gun-related massacres — the only “civilized” country to do so.

Lawmakers recently passed two open carry gun laws in the Texas Senate. Once the bills are reconciled, the paranoid governor, Greg Abbott, will surely sign. But even though this means citizens of the Lone Star State will then be able to openly run around in public, go shopping or visit the local movie theater while packing a firearm, they still can’t walk around with an alcoholic beverage. Incidentally, in 2013, 1,075 people under the age of 19 were killed by guns in the US. That’s more children dying from gunfire than annual military casualties in Iraq and Afghanistan.

But, let’s leave the irony of this situation aside and focus solely on some good old-fashioned sarcasm. Surely, it’s time to mandate that all adults in Texas be required to carry a weapon. Then there would be fewer gunfights, right? And, while the Texas Senate is busy with the open carry law, perhaps State Senators should mandate that all restaurants install double swinging doors, just like those seen in the saloons of classic TV Westerns.

From the Guardian:

Nine people were killed on Sunday and some others injured after a shootout erupted among rival biker gangs at a Central Texas restaurant, sending patrons and bystanders fleeing for safety, a police spokesman said.

The violence erupted shortly after noon at a busy Waco marketplace along Interstate 35 that draws a large lunchtime crowd. Waco police Sergeant W Patrick Swanton said eight people died at the scene of the shooting at a Twin Peaks restaurant and another person died at a hospital.

It was not immediately clear if bystanders were among the dead, although a local TV station, KCEN-TV, reported that all of the fatalities were bikers and police confirmed that no officers had been injured or killed.

Another local station, KXXV, reported that police had recovered firearms, knives, bats and chains from the scene. Restaurant employees locked themselves in freezers after hearing the shots, the station said.

How many injuries had occurred and the severity of those injuries was not known.

“There are still bodies on the scene of the parking lot at Twin Peaks,” Swanton said. “There are bodies that are scattered throughout the parking lot of the next adjoining business.”

A photograph from the scene showed dozens of motorcycles parked in a lot. Among the bikes, at least three people wearing what looked like biker jackets were on the ground, two on their backs and one face down. Police were standing a few feet away in a group. Several other people also wearing biker jackets were standing or sitting nearby.

Swanton said police were aware in advance that at least three rival gangs would be gathering at the restaurant and at least 12 Waco officers in addition to state troopers were at the restaurant when the fight began.

When the shooting began in the restaurant and then continued outside, armed bikers were shot by officers, Swanton said, explaining that the actions of law enforcement prevented further deaths.

Read the entire article here.

Video: Great Western Movie Themes.

MondayMap: The State of Death


It’s a Monday, so why not dwell on an appropriately morbid topic — death. Or, to be more precise, a really cool map that shows the most distinctive causes of death for each state. We know that across the United States in general the most common causes of death are heart disease and cancer. However, looking a little deeper shows other, secondary causes that vary by state. So, leaving aside the top two, you will see that a resident of Tennessee is more likely to die from “accidental discharge of firearms”, while someone from Alabama will succumb to syphilis. Interestingly, Texans are more likely to depart this mortal coil from tuberculosis; Georgians from “abnormal clinical problems not elsewhere classified”. And Alaskans — no surprise here — lead the way in deaths from airplane, boating and “unspecified transport accidents”.

Read more here.

Map: Distinctive cause of death by state. Courtesy of Francis Boscoe, New York State Cancer Registry.


Satan’s Copper


Nickel is element 28 on the periodic table. Its name was bestowed by German copper miners and derives from the word “Kupfernickel”, roughly “Old Nick’s copper”: the miners blamed a mischievous sprite for ore that looked like copper but stubbornly yielded none. Besides being a component of the eponymous US five-cent coin — nowadays it’s actually 75 percent copper — it has some rather surprising uses, from making margarine to forming critical elements of modern jet engines.

From the BBC:

It made the age of cheap foreign holidays possible, and for years it was what made margarine spreadable. Nickel may not be the flashiest metal but modern life would be very different without it.

Deep in the bowels of University College London lies a machine workshop, where metals are cut, lathed and shaped into instruments and equipment for the various science departments.

Chemistry professor Andrea Sella stands before me holding a thick, two-metre-long pipe made of Monel, a nickel-copper alloy. Then he lets it fall to the ground with a deafening clang.

“That really speaks to the hardness and stiffness of this metal,” he explains, picking up the undamaged pipe.

But another reason Monel is a “fantastic alloy”, he says, is that it resists corrosion. Chemists need ways of handling highly reactive materials – powerful acids perhaps, or gases like fluorine and chlorine – so they need something that won’t itself react with them.

Gold, silver or platinum might do, but imagine the price of a two-metre-long pipe made of gold. Nickel by contrast is cheap and abundant, so it crops up everywhere that corrosion is a concern – from chemists’ spatulas to the protective coating on bicycle sprockets.

But nickel can produce other alloys far quirkier than Monel, Sella is eager to explain.

Take Invar, an alloy of nickel and iron. Uniquely, it hardly expands or contracts with changes in the temperature – a property that comes in very handy in precision instruments and clocks, whose workings can be interfered with by the “thermal expansion” of other lowlier metals.

Then there is Nitinol.

Sella produces a wire in the shape of a paperclip – but it is far too easy to twist out of shape to be of use holding sheets of paper together. He mangles it in his fingers, then dips it into a cup of boiling water. It immediately writhes about… and turns back into a perfect paperclip.

Nitinol has a special memory for the shape in which it is first formed. And its composition can be tuned, so that at a particular temperature it will always return to that original shape. This means, for example, that a rolled-up Nitinol stent can be inserted into a blood vessel. As it warms to body temperature, the stent opens itself out, allowing blood to flow through it.

But all these alloys pale in significance compared to a special class of alloys – so special they are called “superalloys”. These are the alloys that made the jet age possible.

The first jet engines were developed simultaneously in the 1930s and 40s, by Frank Whittle in the UK and by Hans von Ohain in Germany, on opposing sides of an accelerating arms race.

Those engines, made of steel, had serious shortcomings.

“They didn’t have the temperature capability to go above about 500C,” explains Mike Hicks, head of materials at Rolls-Royce, the UK’s biggest manufacturer of jet turbines. “Its strength falls off quite quickly and its corrosion resistance isn’t good.”

In response, the Rolls-Royce team that took up Whittle’s work in the 1940s went back to the drawing board – one with the periodic table pinned on to it.

Tungsten was too heavy. Copper melted at too low a temperature. But nickel – with a bit of chromium mixed in – was the Goldilocks recipe. It tolerated high temperatures, it was strong, corrosion-resistant, cheap and light.

Today, the descendants of these early superalloys still provide most of the back end of turbines – both those used on jet planes, and those used in power generation.

“The turbine blades have to operate in the hottest part of the engine, and it’s spinning at a very high speed,” says Hicks’s colleague Neil Glover, head of materials technology research at Rolls-Royce.

“Each one of these blades extracts the same power as a Formula 1 racing car engine, and there are 68 of these in the core of the modern gas turbine engine.”

Read the entire article here.
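Glover’s comparison invites a quick bit of arithmetic. In the sketch below, only the blade count comes from the article; the per-blade power is my assumption, based on a typical Formula 1 engine output of roughly 560 kW (about 750 hp).

```python
# Rough total power extracted by a turbine's blade row, taking the
# "one Formula 1 engine per blade" comparison at face value.
# ASSUMPTION: ~560 kW (about 750 hp) per blade, a typical F1 engine
# output; this figure is not from the article. The blade count is.

F1_ENGINE_POWER_KW = 560.0  # assumed per-blade power
NUM_BLADES = 68             # per the article

total_kw = NUM_BLADES * F1_ENGINE_POWER_KW
print(f"~{total_kw / 1000:.0f} MW extracted by {NUM_BLADES} blades")
# -> roughly 38 MW, on the order of a small power station
```

However you tune the assumed per-blade figure, the total lands in the tens of megawatts, which is why superalloy metallurgy matters so much.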

Image: 1935 Buffalo Nickel. Public Domain.

Finally A Reason For Twitter

Florida Man (@_FloridaMan) finally brings it all into sharp and hysterical focus. Now, I may have a worthy reason for joining the Twitterscape and actually following someone.

From the NYT:

Dangling into the sea like America’s last-ditch lifeline, the state of Florida beckons. Hustlers and fugitives, million-dollar hucksters and harebrained thieves, Armani-wearing drug traffickers and hapless dope dealers all congregate, scheme and revel in the Sunshine State. It’s easy to get in, get out or get lost.

For decades, this cast of characters provided a diffuse, luckless counternarrative to the salt-and-sun-kissed Florida that tourists spy from their beach towels. But recently there arrived a digital-era prototype, @_FloridaMan, a composite of Florida’s nuttiness unspooled, tweet by tweet, to the world at large. With pithy headlines and links to real news stories, @_FloridaMan offers up the “real-life stories of the world’s worst super hero,” as his Twitter bio proclaims.

His more than 1,600 tweets — equal parts ode and derision — are a favorite for weird-news aficionados. Yet, two years since his 2013 debut, the man behind the Twitter feed remains beguilingly anonymous, a Wizard of LOLZ. (The one false note is his zombielike avatar: The mug shot belongs to an Indiana Man.)

His style is deceptively simple. Nearly every Twitter message begins “Florida Man.” What follows, though, is almost always a pile of trouble. Some examples:

Florida Man Tries to Walk Out of Store With Chainsaw Stuffed Down His Pants.

Florida Man Falls Asleep During Sailboat Burglary With Gift Bag on His Head; Can’t Be Woken by Police.

Florida Man Arrested For Directing Traffic While Also Urinating.

Florida Man Impersonates Police Officer, Accidentally Pulls Over Real Police Officer.

Florida Man Says He Only Survived Ax Attack By Drunk Stripper Because “Her Coordination Was Terrible.”

“Now I think there are people who actually aspire to Florida Man-ness,” said Dave Barry, who celebrates Florida’s brand of madness in his popular columns and best-selling books. “It’s like the big leagues. It’s the Broadway for idiots.”

The number of @_FloridaMan’s followers is 270,000. Homages have proliferated: fan art, copycat Twitter feeds (California Man, Texas Man) and, most recently, a craft beer with Florida Man’s avatar.

Florida Man is considerably more popular (and funny) than competitors like Texas Man (732 followers) or California Man (129). But is the Florida Man who Accidentally Shoots Himself With Stun Gun While Trying to Rob the Radio Shack He Also Works At truly more wacky than, let’s say, an Arkansas Man or New Jersey Man?

Read the entire story here.

Your Goldfish is Better Than You


Well, perhaps not at philosophical musings or mathematics. But your little orange aquatic friend now has an attention span that is longer than yours. And it’s all thanks to mobile devices and multi-tasking across multiple media platforms. [Psst, by the way, multi-tasking at the level of media consumption is a fallacy.] On average, the adult attention span is now down to a laughably paltry 8 seconds, whereas the lowly goldfish comes in at 9 seconds. Where that leaves your in-betweeners and teenagers is, of course, anyone’s guess.

From the Independent:

Humans have become so obsessed with portable devices and overwhelmed by content that we now have attention spans shorter than that of the previously jokingly juxtaposed goldfish.

Microsoft surveyed 2,000 people and used electroencephalograms (EEGs) to monitor the brain activity of another 112 in the study, which sought to determine the impact that pocket-sized devices and the increased availability of digital media and information have had on our daily lives.

Among the good news in the 54-page report is that our ability to multi-task has drastically improved in the information age, but unfortunately attention spans have fallen.

In 2000 the average attention span was 12 seconds, but this has now fallen to just eight. The goldfish is believed to be able to maintain a solid nine.

“Canadians [who were tested] with more digital lifestyles (those who consume more media, are multi-screeners, social media enthusiasts, or earlier adopters of technology) struggle to focus in environments where prolonged attention is needed,” the study reads.

“While digital lifestyles decrease sustained attention overall, it’s only true in the long-term. Early adopters and heavy social media users front load their attention and have more intermittent bursts of high attention. They’re better at identifying what they want/don’t want to engage with and need less to process and commit things to memory.”

Anecdotally, many of us can relate to the increasing inability to focus on tasks, being distracted by checking your phone or scrolling down a news feed.

Another recent study by the National Centre for Biotechnology Information and the National Library of Medicine in the US found that 79 per cent of respondents used portable devices while watching TV (known as dual-screening) and 52 per cent check their phone every 30 minutes.

Read the entire story here.

Image: Common Goldfish. Public Domain.


Atheists Growing in Number, But Still Hated


While I’ve lived in the United States for quite some time now, it continues to perplex. It may still be a land of opportunity, but it remains a head-scratching paradox. Take religion. On the one hand, a recent survey by the Pew Research Center found that 22.8 percent of the adult population has no religious affiliation. That is, almost one quarter is atheist, agnostic or has no identification with any organized religion. This is up from 16 percent a mere seven years earlier. Yet, on the other hand, atheists and non-believers make up one of the most hated groups in the country — second only to Muslims. And I don’t know where Satanists figure in this analysis.

Pew’s analysis also slices the data by political affiliation and, to no surprise, finds that Republicans generally hate atheists more than those on the left of the political spectrum do. For Pew’s next research effort I would suggest they examine which religious affiliations hate atheists the most.

From the Guardian:

The dominant Christian share of the American population is falling sharply while the number of US adults who do not believe in God or prefer not to identify with any organized religion is growing significantly, according to a new report.

The trend is affecting Americans across the country and across all demographics and age groups – but is especially pronounced among young people, the survey by the Pew Research Center found.

In the last seven years, the proportion of US adults declaring themselves Christian fell from 78.4% to 70.6%, with the mainstream protestant, Catholic and evangelical protestant faiths all affected.

Over the same period, those in the category that Pew labeled religiously “unaffiliated” – those describing themselves as atheist, agnostic or “nothing in particular” – jumped from 16.1% of the population to between a fifth and a quarter, at 22.8%, the report, released on Tuesday, found.

“The US remains home to more Christians than any other country in the world, and a large majority of Americans continue to identify with some branch of the Christian faith, but the percentage of adults who describe themselves as Christians has dropped by almost eight points since 2007,” the survey found.

The change in non-Christian religious faiths, including Jews, Muslims, Buddhists, Hindus and “other world religions and faiths” crept up modestly from 4.7% to 5.9% of US adults.

“The younger generation seem much less involved in organized religion and the older generation is passing on, which is a very important factor,” John Green, a professor of political science at the University of Akron in Ohio and an adviser on the survey, told the Guardian.

Tuesday’s report is called the Religious Landscape Study and is the second of its kind prepared by the Pew Research Center.

Pew first conducted such a survey in 2007 and repeated it in 2014 then made comparisons.

The US census does not ask Americans to specify their religion, and there are no official government statistics on the religious composition of the US population, the report pointed out, adding that researchers gathered their material by conducting the survey in Spanish and English across a nationally representative sample of 35,000 US adults.

Green said there were a number of different theories behind more young people eschewing organized religion.

“The involvement of religious groups in politics, particularly regarding issues such as same sex marriage and abortion, is alienating younger adults, who tend to have more liberal and progressive views than older people,” he said.

The rise of the internet and social media has also drawn younger adults towards online, general social groups and away from face-to-face organizations and traditional habits, such as churchgoing, he said.

And there is a theory that the fact that more young people in this generation are going to college is linked to their falling interest in organized religion, he said.

Read the entire story here.

Infographic courtesy of the Pew Research Center.

Self-Absorbed? Rejoice!


From a culture that celebrates all things selfie comes the next logical extension: an invention that will surely delight any image-conscious narcissist.

The “selfie arm” is a wonderful tongue-firmly-in-cheek invention of artists Aric Snee and Justin Crowe. Their aim: to comment on the illusion of sociability and connectedness. Thankfully they plan to construct only 10 of these contraptions. But, you know, somewhere and soon, a dubious entrepreneur will be hawking these for $19.95.

One can only hope that the children of Gen-Selfie will eventually rebel against their self-absorbed parents — until then I’m crawling back under my rock.

From Wired UK:

A selfie stick designed to look like a human arm will ensure you never look alone, but always feel alone. The accessory is designed to make it appear that a lover or friend is holding your hand while taking a photo, removing the crushing sense of narcissistic loneliness otherwise swamping your existence.

The prototype ‘selfie arm’ is the work of artists Justin Crowe and Aric Snee and isn’t intended to be taken seriously. Made of fibreglass, the selfie arm was created in protest against the “growing selfie stick phenomenon, and the constant, gnawing need for narcissistic internet validation,” according to Designboom.

Read the entire article here.

Image: Selfie arm by Aric Snee and Justin Crowe. Courtesy of Aric Snee and Justin Crowe.

Hard Work Versus Smart Work

If you work any kind of corporate job it’s highly likely that you’ll hear any of the following on an almost daily basis: “good job, all those extra hours you put in really paid off”, “I always eat lunch at my desk”, “yes… worked late again yesterday”, “… are you staying late too?”, “I know you must have worked so many long hours to get the project done”, “I’m really impressed at the hours you dedicate…”, “what a team, you all went over and above… working late, working weekends, sacrificing vacation…”, and so on.

The workaholic culture — particularly in the United States — reinforces the notion that long hours are to be celebrated and rewarded. Many just seem to mistake long hours for persistence and resilience. On the surface it seems to be a great win for the employer: get more hours out of your employees, and it’s free. Of course, recent analyses of work-life balance show that pushing employees beyond a certain number of hours is thoroughly counterproductive — beyond the deleterious effects on employees, the quality of the work suffers too. But it turns out that a not insignificant number of wily subordinates may actually be gaming the 80-hour workweek. And don’t forget the other group of hard workers — those who put in endless hours of so-called “busy work” just to look industrious.

What happened to encouraging and incentivizing employees, and bosses, to work smartly rather than just hard? Reward long hours and there is no incentive for innovation or change; reward smartness and creativity thrives. The current mindset may take generations to alter — you’ll easily come across the word “hardworking” in the dictionary, but you’ll have no luck finding “smartworking”.

From the NYT:

Imagine an elite professional services firm with a high-performing, workaholic culture. Everyone is expected to turn on a dime to serve a client, travel at a moment’s notice, and be available pretty much every evening and weekend. It can make for a grueling work life, but at the highest levels of accounting, law, investment banking and consulting firms, it is just the way things are.

Except for one dirty little secret: Some of the people ostensibly turning in those 80- or 90-hour workweeks, particularly men, may just be faking it.

Many of them were, at least, at one elite consulting firm studied by Erin Reid, a professor at Boston University’s Questrom School of Business. It’s impossible to know if what she learned at that unidentified consulting firm applies across the world of work more broadly. But her research, published in the academic journal Organization Science, offers a way to understand how the professional world differs between men and women, and some of the ways a hard-charging culture that emphasizes long hours above all can make some companies worse off.

Ms. Reid interviewed more than 100 people in the American offices of a global consulting firm and had access to performance reviews and internal human resources documents. At the firm there was a strong culture around long hours and responding to clients promptly.

“When the client needs me to be somewhere, I just have to be there,” said one of the consultants Ms. Reid interviewed. “And if you can’t be there, it’s probably because you’ve got another client meeting at the same time. You know it’s tough to say I can’t be there because my son had a Cub Scout meeting.”

Some people fully embraced this culture and put in the long hours, and they tended to be top performers. Others openly pushed back against it, insisting upon lighter and more flexible work hours, or less travel; they were punished in their performance reviews.

The third group is most interesting. Some 31 percent of the men and 11 percent of the women whose records Ms. Reid examined managed to achieve the benefits of a more moderate work schedule without explicitly asking for it.

They made an effort to line up clients who were local, reducing the need for travel. When they skipped work to spend time with their children or spouse, they didn’t call attention to it. One team on which several members had small children agreed among themselves to cover for one another so that everyone could have more flexible hours.

A male junior manager described working to have repeat consulting engagements with a company near enough to his home that he could take care of it with day trips. “I try to head out by 5, get home at 5:30, have dinner, play with my daughter,” he said, adding that he generally kept weekend work down to two hours of catching up on email.

Despite the limited hours, he said: “I know what clients are expecting. So I deliver above that.” He received a high performance review and a promotion.

What is fascinating about the firm Ms. Reid studied is that these people, who in her terminology were “passing” as workaholics, received performance reviews that were as strong as those of their hyper-ambitious colleagues. For people who were good at faking it, there was no real damage done by their lighter workloads.

It calls to mind the episode of “Seinfeld” in which George Costanza leaves his car in the parking lot at Yankee Stadium, where he works, and gets a promotion because his boss sees the car and thinks he is getting to work earlier and staying later than anyone else. (The strategy goes awry for him, and is not recommended for any aspiring partners in a consulting firm.)

Read the entire article here.

Real Magic

[tube]UibfDUPJAEU[/tube]

Literary, social, moral and philanthropic leadership. These are all very admirable qualities. We might strive to embody just one of these in our daily lives. Author J.K. Rowling seems to demonstrate all four. In her new book, Very Good Lives: The Fringe Benefits of Failure and the Importance of Imagination, published in April 2015, she distills advice from her self-effacing but powerful Harvard University commencement speech, delivered in 2008.

A couple of my favorite quotes:

Many prefer not to exercise their imaginations at all. They choose to remain comfortably within the bounds of their own experience, never troubling to wonder how it would feel to have been born other than they are.

Some failure in life is inevitable. It is impossible to live without failing at something, unless you live so cautiously that you might as well not have lived at all.

Video: J.K. Rowling Harvard Commencement Speech, 2008. Courtesy of Harvard University.

The Biggest Threats to Democracy

Edward_Snowden

History reminds us of those critical events that pose threats to us on various levels: to our well-being at a narrow level and to the foundations of our democracies at a much broader level. And most of these existential threats seem to come from the outside: wars, terrorism, ethnic cleansing.

But it’s not quite that simple — the biggest threats come not from external sources of evil, but from within us. Perhaps the two most significant are our apathy and our paranoia. Taken together they erode our duty to protect our democracy, and hand ever-increasing power to those who claim to protect us. Thus, before the Nazi machine enslaved huge portions of Europe, the citizens of Germany allowed it to gain power; before Al-Qaeda and Isis and their terrorist look-alikes gained notoriety, local conditions allowed these groups to flourish. We are all complicit in our inaction — driven by indifference or fear, or both.

Two timely events serve to remind us of the huge costs and consequences of inaction born of apathy and paranoia. One comes from the not-too-distant past; the other portends our future. First, it is Victory in Europe (VE) Day, the anniversary of the Allied victory in WWII on May 8, 1945. Many millions perished through the brutal policies of Nazi ideology and its instrument, the Wehrmacht, and millions more subsequently perished in the fight to restore moral order. Much of Europe first ignored the growing threat of the National Socialists. As the threat grew, Europe continued to contemplate appeasement. Only later, as the true scale of the atrocities became apparent, did leaders realize that the threat needed to be tackled head-on.

Second, a federal appeals court in the United States ruled on May 7, 2015 that the National Security Agency’s collection of millions of phone records is illegal. This serves to remind us of the threat that our own governments pose to our fundamental freedoms under the promise of continued comfort and security. For those who truly care about the fragility of democracy this is a momentous and rightful ruling. It is all the more remarkable that since the calamitous events of September 11, 2001 few have challenged this governmental overreach into our private lives: our phone calls, our movements, our internet surfing habits, our credit card history. We have seen few public demonstrations and all too little ongoing debate. Indeed, only through the recent revelations by Edward Snowden did the debate even enter the media cycle. And, the debate is only just beginning.

Both of these events show that only we, the people who are fortunate enough to live within a democracy, can choose a path that strengthens our governmental institutions and balances these against our fundamental rights. By corollary we can choose a path that weakens our institutions too. One path requires engagement and action against those who use fear to make us conform. The other path, often easier, requires that we do nothing, accept the status quo, curl up in the comfort of our cocoons and give in to fear.

So this is why the appeals court ruling is so important. While only three in number, the judges have established that our government has been acting illegally, yet supposedly on our behalf. While the judges did not terminate the unlawful program, they pointedly requested the US Congress to debate and then define laws that would be narrower and less at odds with citizens’ constitutional rights. So, the courts have done us all a great favor. One can only hope that this opens the eyes, ears and mouths of the apathetic and fearful so that they continuously demand fair and considered action from their elected representatives. Only then can we begin to make inroads against the real and insidious threats to our democracy — our apathy and our fear. And perhaps, also, Mr. Snowden can take a small helping of solace.

From the Guardian:

The US court of appeals has ruled that the bulk collection of telephone metadata is unlawful, in a landmark decision that clears the way for a full legal challenge against the National Security Agency.

A panel of three federal judges for the second circuit overturned an earlier ruling that the controversial surveillance practice first revealed to the US public by NSA whistleblower Edward Snowden in 2013 could not be subject to judicial review.

But the judges also waded into the charged and ongoing debate over the reauthorization of a key Patriot Act provision currently before US legislators. That provision, which the appeals court ruled the NSA program surpassed, will expire on 1 June amid gridlock in Washington on what to do about it.

The judges opted not to end the domestic bulk collection while Congress decides its fate, calling judicial inaction “a lesser intrusion” on privacy than at the time the case was initially argued.

“In light of the asserted national security interests at stake, we deem it prudent to pause to allow an opportunity for debate in Congress that may (or may not) profoundly alter the legal landscape,” the judges ruled.

But they also sent a tacit warning to Senator Mitch McConnell, the Republican leader in the Senate who is pushing to re-authorize the provision, known as Section 215, without modification: “There will be time then to address appellants’ constitutional issues.”

“We hold that the text of section 215 cannot bear the weight the government asks us to assign to it, and that it does not authorize the telephone metadata program,” concluded their judgment.

“Such a monumental shift in our approach to combating terrorism requires a clearer signal from Congress than a recycling of oft-used language long held in similar contexts to mean something far narrower,” the judges added.

“We conclude that to allow the government to collect phone records only because they may become relevant to a possible authorized investigation in the future fails even the permissive ‘relevance’ test.

“We agree with appellants that the government’s argument is ‘irreconcilable with the statute’s plain text’.”

Read the entire story here.

Image: Edward Snowden. Courtesy of Wikipedia.

The Lone (And Paranoid) Star State

Flag_of_the_Republic_of_Texas

The Lone Star State continues to take pride in doing its own thing. After all, it has a legacy to uphold dating from its very founding — one of fierce and outspoken independence. But sometimes this leads to blind political arrogance, soon followed by growing paranoia.

You see, newly minted Texas Governor Greg Abbott has a theory that the US military is about to put his state under martial law. So he has deployed the Texas State Guard to monitor any dubious federal activity and, one supposes, to curtail any attempts at a coup d’état. If I were Governor Abbott I would not overly trouble myself with a possible federal takeover of the state. After all, citizens will very soon be able to openly carry weapons in public — 20 million Texans “packing heat” [carrying a loaded gun, for those not versed in the subtle American vernacular] will surely deter the feds.

From NPR:

Since Gen. Sam Houston executed his famous retreat to glory to defeat the superior forces of Gen. Antonio Lopez de Santa Anna, Texas has been ground zero for military training. We have so many military bases in the Lone Star State we could practically attack Russia.

So when rookie Texas Gov. Greg Abbott announced he was ordering the Texas State Guard to monitor a Navy SEAL/Green Beret joint training exercise, which was taking place in Texas and several other states, everybody here looked up from their iPhones. What?

It seems there is concern among some folks that this so-called training maneuver is just a cover story. What’s really going on? President Obama is about to use Special Forces to put Texas under martial law.

Let’s walk over by the fence where nobody can hear us, and I’ll tell you the story.

You see, there are these Wal-Marts in West Texas that supposedly closed for six months for “renovation.” That’s what they want you to believe. The truth is these Wal-Marts are going to be military guerrilla-warfare staging areas and FEMA processing camps for political prisoners. The prisoners are going to be transported by train cars that have already been equipped with shackles.

Don’t take my word for it. That comes directly from a Texas Ranger, who seems pretty plugged in, if you ask me. You and I both know President Obama has been waiting a long time for this, and now it’s happening. It’s a classic false flag operation. Don’t pay any attention to the mainstream media; all they’re going to do is lie and attack everyone who’s trying to tell you the truth.

Did I mention the ISIS terrorists? They’ve come across the border and are going to hit soft targets all across the Southwest. They’ve set up camp a few miles outside of El Paso.

That includes a Mexican army officer and Mexican federal police inspector. Not sure what they’re doing there, but probably nothing good. That’s why the Special Forces guys are here, get it? To wipe out ISIS and impose martial law. So now you know, whaddya say we get back to the party and grab another beer?

It’s true that the paranoid worldview of right-wing militia types has remarkable stamina. But that’s not news.

What is news is that there seem to be enough of them in Texas to influence the governor of the state to react — some might use the word pander — to them.

That started Monday when a public briefing by the Army in Bastrop County, which is just east of Austin, got raucous. The poor U.S. Army colonel probably just thought he was going to give a regular briefing, but instead 200 patriots shouted him down, told him he was a liar and grilled him about the imminent federal takeover of Texas and subsequent imposition of martial law.

“We just want to make sure our guys are trained. We want to hone our skills,” Lt. Col. Mark Lastoria tried to explain in vain.

One wonders what Lastoria was thinking to himself as he walked to his car after two hours of his life he’ll never get back. God bless Texas? Maybe not.

The next day Abbott decided he had to take action. He announced that he was going to ask the Texas State Guard to monitor Operation Jade Helm from start to finish.

“It is important that Texans know their safety, constitutional rights, private property rights and civil liberties will not be infringed upon,” Abbott said.

The idea that the Yankee military can’t be trusted down here has a long and rich history in Texas. But that was a while back. Abbott’s proclamation that he was going to keep his eye on these Navy SEAL and Green Beret boys did rub some of our leaders the wrong way.

Former Texas Lt. Gov. David Dewhurst tried to put it in perspective for outsiders when he explained, “Unfortunately, some Texans have projected their legitimate concerns about the competence and trustworthiness of President Barack Obama on these noble warriors. This must stop.”

Another former Republican politician was a bit more pointed.

“Your letter pandering to idiots … has left me livid,” former state Rep. Todd Smith wrote Abbott. “I am horrified that I have to choose between the possibility that my Governor actually believes this stuff and the possibility that my Governor doesn’t have the backbone to stand up to those who do.”

Read the entire story here.

Image: The “Burnet Flag,” used from 1836 to 1839 as the national flag of the Republic of Texas until it was replaced by the currently used “Lone Star Flag.” Public Domain. Courtesy of Wikipedia.

The Six Percent

Google-search-trailer-park

According to the last US census, around 6 percent of the population — some 20 million people — live in trailer parks. This is a startling and significant number, and it continues to grow; economic inequality and financial hardship hit those on the lowest rungs of the socio-economic ladder the hardest. And, of course, this means that trailer park owners — typically people at the other end of the economic ladder — are salivating over increased market share, higher rents, greater revenue and better profits.

From the Guardian:

The number one rule is stated twice, once in the classroom and once on the bus: “Don’t make fun of the residents.” Welcome to Mobile Home University, a three-day, $2,000 “boot camp” that teaches people from across the US how to make a fortune by buying up trailer parks.

Trailer parks are big and profitable business – particularly after hundreds of thousands of Americans who lost their homes in the financial crisis created a huge demand for affordable housing. According to US Census figures, more than 20 million people, or 6% of the population, live in trailer parks.

It is a market that has not been lost on some of the country’s richest and most high-profile investors. Sam Zell’s Equity LifeStyle Properties (ELS) is the largest mobile home park owner in America, with controlling interests in nearly 140,000 parks. In 2014, ELS made $777m in revenue, helping boost Zell’s near-$5bn fortune.

Warren Buffett, the nation’s second-richest man with a $72bn fortune, owns the biggest mobile home manufacturer in the US, Clayton Homes, and the two biggest mobile home lenders, 21st Mortgage Corporation and Vanderbilt Mortgage and Finance Company. Buffett’s trailer park investments will feature heavily at his annual meeting this weekend, which will be attended by more than 40,000 shareholders in Omaha.

Such success is prompting ordinary people with little or no experience to try to follow in their footsteps.

On a bright Saturday morning, under the Floridian sun, Frank Rolfe, the multimillionaire co-founder of Mobile Home University who is the nation’s 10th-biggest trailer park owner, conducts a tour of parks around Orlando, Florida. A busload of hopefuls, ranging in age from early 20s to late 70s, hangs on his every word.

As the tour approaches its first stop, Rolfe repeats a warning which earlier flashed on to a screen in a conference room of the Orlando airport Hyatt hotel: “When we are on the property, don’t make fun of the residents, or say things that can get us in trouble or offend anyone. I once had a bank come to a mobile home park and say in front of my manager, ‘Only a white trash idiot would live in a trailer.’”

Then comes a second, more unexpected warning: “Now, guys, I’ve got to tell you this park, I believe, is a sex-offender park. Everyone in here is a sex offender. I could be wrong, we’re going to find out, but I think that’s the deal on this one. So stay together as a herd.”

He’s not wrong. Signs at the entrance to Lake Shore Village, on the north-eastern outskirts of Orlando, warn: “Adults only. No Children.” The park is described on the owner’s business cards as “sex offender housing” and a “habitat for offenders”.

On the forecourt the owner, Lori Lee, tells Rolfe’s students she dedicated the park to sex offenders 20 years ago – and hasn’t looked back.

“We were a family park when we first started. [But] about 20 years ago, I couldn’t get on the property because a drug dealer had separated from his girlfriend in the park across the street … and there was a long line of cars because she was undercutting her boyfriend.”

Lee, 70, says she was advised that if she took in sex offenders the drug dealers would leave. “So, I started taking in sex offenders, and I have a very clean property. Sex offenders are watched by the news media, the TV, the sheriff’s department, probation, the department of corrections … so when they are in there, the drug dealers and the other people don’t like to be around.”

Sex offenders have been good for Lee financially, with park occupancy running at “1,000%”. She rents trailer pad spots for about $325 a month. The trailers are either owned by the tenant or rented from a third party. Many trailers are divided into three bedrooms, for which tenants are charged $500 a month per room.

Lee claims she was once offered $5m for Lake Shore Park, which is home to about 50 trailers.

“Last year I bought a park down the street, got rid of all the families, the drug dealers, the prostitutes, and brought in convicted felons. And then I bought the property across the way,” she says. “Once you’re into it and you’re making money it’s easy to say, ‘One more, one more’.”

She has her eyes on a fourth park, “but then I’m through. I’m 70 years old and I don’t want to own any more”.

Asked by an eager investor how regularly tenants leave her parks, Lee says: “When they die. [They] stay forever, they have no place to go.”

Lee’s strategy impresses Rolfe’s students.

“I thought it was a brilliant idea, brilliant,” says Mitch Huhem, who is looking to buy a trailer park with his wife, Deborah. “These people need a place to live, and they don’t want to mess around.

“They’ve got to live somewhere, so you combine them in a certain place. They don’t go out to hurt people. I think it’s a community service, because if not they will be in your neighbourhood. Now they’re all in one place, you can watch them all in one place. And they pay well and won’t mess things up. I mean, why would you not? I think it’s a brilliant idea.”

Rolfe, who with his business partner Dave Reynolds owns about 160 parks across the midwest, is unsure about taking in sex offenders. But he is certain Lee could make even more money if she raised the rent.

“She could definitely raise the rent,” he says, as the tour group gets back on the bus. “She’s got a definite niche, but she is definitely under the Orlando rent; she might be under by $100 a month, maybe.

“Raising the rent is typically part of the day one purchase, because often the ‘mom and pop’ [previous, family-run owner of a park] has not raised the rent in years so it’s far below market.

“[The rents] do not go down, that’s one thing that’s a safe bet in the trailer park world. Our rents do not go down.

“We traditionally raise our rents by an average of 10% a year or something like that, and it’s pretty much true for the industry. Our world record [rent increase] went from $125 to $275 in one month.”

Rolfe, who bought a pistol for personal security when he bought his first park, 20 years ago, says he sent a letter to every tenant at that park in Grapevine, Texas, telling them the rent was going to more than double but was still below the market rate of $325.

“If you don’t like this or you think you can do better, here’s a list of all the other parks in Grapevine and a list of the owners,” he said in the letter. “Go ahead, call them if you want to move. How many customers do you think we lost? Zero. Where were they going to go?”

Rolfe, who started Mobile Home University seven years ago and now runs boot camps every couple of months in cities across the country, tells his students they can easily increase the rent even at parks that are already charging market rates, because there is so much demand for affordable housing and local authorities are very reluctant to grant permission for new parks.

He quotes US government statistics showing that in 2013, 39% of Americans earned less than $20,000 – less than the government’s poverty threshold income of $20,090 for a three-person household.

“That’s huge. No one believes that number – people say: ‘You’re crazy, this is America, everyone is rich.’ [Being on an income of $20,000 or less] means you have a budget of about $500 a month for your housing, but the average two-bedroom apartment is $1,109 a month. There’s not a lot you can do.”

Kenneth Staton, a 58-year-old disabled tenant at a nearby (non-sex-offender) trailer park, knows it.

“It’s a profitable investment, but raising the rent is what hurts because people like myself, we’re on a fixed income and we can only afford so much,” he says, on the dirt road outside his trailer. “I’m on disability, and I go around and collect aluminium cans to see myself through a little bit.”

Asked if he thinks he will see out his days in the trailer park, Staton says: “It kinda looks like it, unless I can find a house somewhere I can afford. I only get $830 a month; $500 goes for rent, about $95 goes for electric. It don’t leave much to live on. Luckily, I get food stamps.”

Read the entire article here.

Image courtesy of Google Search.

Soviet Optics

Krakow-poland-1988

The heavy hand of the Soviet Union left untold scars on the populations of many Eastern European nations. Millions of citizens were repressed, harmed and spied upon, and countless numbers disappeared. The Soviets and their socialist puppet governments also fostered many decades of centrally-planned austerity that impoverished generations — though not the ruling elites, of course. Nonetheless, independent shopkeepers would try to put a brave face on their lack of a market for most goods and services — little supply and limited demand.

Photographer David Hlynsky spent several years in Eastern Europe, in the years around the fall of the Berlin Wall, documenting the waning of the Soviet era. His book, Window-Shopping Through the Iron Curtain, featuring many absurdly bleak views of consumer-minimalism [not necessarily a bad thing], was published in February 2015.

Read more from The Guardian’s article here.

Image: Three Loaves of Bread, Krakow, Poland, 1988. Courtesy of David Hlynsky.

Baroness Thatcher and the Media Baron

The cozy yet fraught relationship between politicians and powerful figures in the media has been with us since the first days of newsprint. It’s a delicate symbiosis of sorts — the politician needs the media magnate to help acquire and retain power; the media baron needs the politician to shape and centralize it. The underlying motivations seem similar for both parties, hence the symbiosis — self-absorption, power, vanity.

So it comes as no surprise to read intimate details of the symbiotic Rupert Murdoch / Margaret Thatcher years. Prime Minister Thatcher would sometimes actively, but often surreptitiously, support Murdoch’s megalomaniacal desire to corner the UK (and global) media, while Murdoch would ensure his outlets channeled suitably Thatcher-friendly news, spin and op-eds. But the Thatcher-Murdoch story is just the latest in a long line of business deals between puppet and puppet-master [you may decide which is which, dear reader]. Over the last hundred years we’ve had William Randolph Hearst and Roosevelt, Lloyd George and Northcliffe, Harold Wilson and Robert Maxwell, Baldwin and Beaverbrook.

Thomas Jefferson deplored newspapers — seeing them as vulgar and cancerous. His prescient analysis of the troubling and complex relationship between the news and politics is just as valid today: “an evil for which there is no remedy; our liberty depends on the freedom of the press, and this cannot be limited without being lost”.

Yet for all the grievous faults and dubious shenanigans of the brutish media barons and their fickle political spouses, the Thatcher-Murdoch story is perhaps not as sinister as one might first think. We now live in an age where faceless corporations and billionaires broker political power and shape policy behind mountains of money, obfuscated institutions and closed doors. This is far more troubling for our democracies. I would rather fight an evil that has a face.

From the Guardian:

The coup that transformed the relationship between British politics and journalism began at a quiet Sunday lunch at Chequers, the official country retreat of the prime minister, Margaret Thatcher. She was trailing in the polls, caught in a recession she had inherited, eager for an assured cheerleader at a difficult time. Her guest had an agenda too. He was Rupert Murdoch, eager to secure her help in acquiring control of nearly 40% of the British press.

Both parties got what they wanted.

The fact that they met at all, on 4 January 1981, was vehemently denied for 30 years. Since their lie was revealed, it has been possible to uncover how the greatest extension of monopoly power in modern press history was planned and executed with such furtive brilliance.

All the wretches in the subsequent hacking sagas – the predators in the red-tops, the scavengers and sleaze merchants, the blackmailers and bribers, the liars, the bullies, the cowed politicians and the bent coppers – were but the detritus of a collapse of integrity in British journalism and political life. At the root of the cruelties and extortions exposed in the recent criminal trials at the Old Bailey, was Margaret Thatcher’s reckless engorgement of the media power of her guest that January Sunday. The simple genesis of the hacking outrages is that Murdoch’s News International came to think it was above the law, because it was.

Thatcher achieved much as a radical prime minister confronted by political turmoil and economic torpor. So did Murdoch, in his liberation of British newspapers from war with the pressroom unions, and by wresting away the print unions’ monopoly of access to computer technology. I applauded his achievements, and still do, as I applauded many of Thatcher’s initiatives when I chaired the editorial boards of the Sunday Times (1967-81) and then the Times (1981-2). It is sad that her successes are stained by recent evidence of her readiness to ensure sunshine headlines for herself in the Murdoch press (especially when it was raining), at a heavy cost to the country. She enabled her guest to avoid a reference to the Monopolies and Mergers Commission, even though he already owned the biggest-selling daily newspaper, the Sun, and the biggest selling Sunday newspaper, the News of the World, and was intent on acquiring the biggest-selling quality weekly, the Sunday Times, and its stablemate, the Times. 

 Times Newspapers had long cherished their independence. In 1966, when the Times was in financial difficulty, the new owner who came to the rescue, Lord Roy Thomson of Fleet, promised to sustain it as an independent non-partisan newspaper – precisely how he had conducted the profitable Sunday Times. Murdoch was able to acquire both publications in 1981 only because he began making solemn pledges that he would maintain the tradition of independence. He broke every one of those promises in the first years. His breach of the undertakings freely made for Times Newspapers was a marked contrast with the independent journalism we at the Sunday Times (and William Rees-Mogg at the Times) had enjoyed under the principled ownership of the Thomson family. Thatcher was a vital force in reviving British competitiveness, but she abetted a concentration of press power that became increasingly arrogant and careless of human dignity in ways that would have appalled her, had she remained in good health long enough to understand what her actions had wrought.

Documents released by the Thatcher Archive Trust, now housed at Churchill College, Cambridge, give the lie to a litany of Murdoch-Thatcher denials about collusion during the bidding for Times Newspapers. They also expose a crucial falsehood in the seventh volume of The History of the Times: The Murdoch Years – the official story of the newspaper from 1981-2002, published in 2005 by the Murdoch-owned HarperCollins. In it Graham Stewart wrote, in all innocence, that Murdoch and Thatcher “had no communication whatsoever during the period in which the Times bid and presumed referral to the Monopolies and Mergers Commission was up for discussion”.

Read the entire story here.


Marketing of McGod

google-search-church-logos

Many churches now have their own cool logos. All of the large churches and mega-churches have well-defined brands and well-oiled marketing departments. Clearly, God is not doing enough to disseminate his (or her) message — God needs help from ad agencies and marketing departments. Modern-day evangelism is not only a big business, it’s now a formalized business process, with key objectives, market-share drivers, growth strategies, metrics and key performance indicators (KPIs) — just like any other corporate franchise.

But some Christians believe that there is more (or, actually, less) to their faith than neo-evangelical brands like Vine, Gather, Vertical or Prime. So, some are shunning these houses of “worshipfotainment” [my invention, dear reader] with their high production values and edgy programming; they are forgoing mega-screens with Jesus PowerPoint and heavenly lasers, lattes in the lobby and hip Christian metal. A millennial tells her story of disillusionment with the McChurch — its evangelical shallowness and exclusiveness.

From the Washington Post:

Bass reverberates through the auditorium floor as a heavily bearded worship leader pauses to invite the congregation, bathed in the light of two giant screens, to tweet using #JesusLives. The scent of freshly brewed coffee wafts in from the lobby, where you can order macchiatos and purchase mugs boasting a sleek church logo. The chairs are comfortable, and the music sounds like something from the top of the charts. At the end of the service, someone will win an iPad.

This, in the view of many churches, is what millennials like me want. And no wonder pastors think so. Church attendance has plummeted among young adults. In the United States, 59 percent of people ages 18 to 29 with a Christian background have, at some point, dropped out. According to the Pew Forum on Religion & Public Life, among those of us who came of age around the year 2000, a solid quarter claim no religious affiliation at all, making my generation significantly more disconnected from faith than members of Generation X were at a comparable point in their lives and twice as detached as baby boomers were as young adults.

In response, many churches have sought to lure millennials back by focusing on style points: cooler bands, hipper worship, edgier programming, impressive technology. Yet while these aren’t inherently bad ideas and might in some cases be effective, they are not the key to drawing millennials back to God in a lasting and meaningful way. Young people don’t simply want a better show. And trying to be cool might be making things worse.

 You’re just as likely to hear the words “market share” and “branding” in church staff meetings these days as you are in any corporate office. Megachurches such as Saddleback in Lake Forest, Calif., and Lakewood in Houston have entire marketing departments devoted to enticing new members. Kent Shaffer of ChurchRelevance.com routinely ranks the best logos and Web sites and offers strategic counsel to organizations like Saddleback and LifeChurch.tv.

Increasingly, churches offer sermon series on iTunes and concert-style worship services with names like “Vine” or “Gather.” The young-adult group at Ed Young’s Dallas-based Fellowship Church is called Prime, and one of the singles groups at his father’s congregation in Houston is called Vertical. Churches have made news in recent years for giving away tablet computers, TVs and even cars at Easter. Still, attendance among young people remains flat.

Recent research from Barna Group and the Cornerstone Knowledge Network found that 67 percent of millennials prefer a “classic” church over a “trendy” one, and 77 percent would choose a “sanctuary” over an “auditorium.” While we have yet to warm to the word “traditional” (only 40 percent favor it over “modern”), millennials exhibit an increasing aversion to exclusive, closed-minded religious communities masquerading as the hip new places in town. For a generation bombarded with advertising and sales pitches, and for whom the charge of “inauthentic” is as cutting an insult as any, church rebranding efforts can actually backfire, especially when young people sense that there is more emphasis on marketing Jesus than actually following Him. Millennials “are not disillusioned with tradition; they are frustrated with slick or shallow expressions of religion,” argues David Kinnaman, who interviewed hundreds of them for Barna Group and compiled his research in “You Lost Me: Why Young Christians Are Leaving Church … and Rethinking Faith.”

My friend and blogger Amy Peterson put it this way: “I want a service that is not sensational, flashy, or particularly ‘relevant.’ I can be entertained anywhere. At church, I do not want to be entertained. I do not want to be the target of anyone’s marketing. I want to be asked to participate in the life of an ancient-future community.”

Millennial blogger Ben Irwin wrote: “When a church tells me how I should feel (‘Clap if you’re excited about Jesus!’), it smacks of inauthenticity. Sometimes I don’t feel like clapping. Sometimes I need to worship in the midst of my brokenness and confusion — not in spite of it and certainly not in denial of it.”

When I left church at age 29, full of doubt and disillusionment, I wasn’t looking for a better-produced Christianity. I was looking for a truer Christianity, a more authentic Christianity: I didn’t like how gay, lesbian, bisexual and transgender people were being treated by my evangelical faith community. I had questions about science and faith, biblical interpretation and theology. I felt lonely in my doubts. And, contrary to popular belief, the fog machines and light shows at those slick evangelical conferences didn’t make things better for me. They made the whole endeavor feel shallow, forced and fake.

Read the entire story here.

Spam, Spam, Spam: All Natural

Google-search-natural-junk-food

Parents through the ages have often decried the mangling of their mother tongue by subsequent generations. Language is fluid after all, particularly English, and our youth constantly add their own revisions to carve a divergent path from their elders. But the real focus of our disdain for the ongoing destruction of our linguistic heritage should be corporations and their hordes of marketeers and lawyers. Take the once simple and meaningful word “natural”. You’ll see its oxymoronic application each time you stroll the aisles of your grocery store: one hundred percent natural fruit roll-ups; all-natural chicken rings; completely natural corn dogs; totally naturally flavored cheese puffs. The word — natural — has become meaningless.

From NYT:

It isn’t every day that the definition of a common English word that is ubiquitous in common parlance is challenged in federal court, but that is precisely what has happened with the word “natural.” During the past few years, some 200 class-action suits have been filed against food manufacturers, charging them with misuse of the adjective in marketing such edible oxymorons as “natural” Cheetos Puffs, “all-natural” Sun Chips, “all-natural” Naked Juice, “100 percent all-natural” Tyson chicken nuggets and so forth. The plaintiffs argue that many of these products contain ingredients — high-fructose corn syrup, artificial flavors and colorings, chemical preservatives and genetically modified organisms — that the typical consumer wouldn’t think of as “natural.”

Judges hearing these cases — many of them in the Northern District of California — have sought a standard definition of the adjective that they could cite to adjudicate these claims, only to discover that no such thing exists.

Something in the human mind, or heart, seems to need a word of praise for all that humanity hasn’t contaminated, and for us that word now is “natural.” Such an ideal can be put to all sorts of rhetorical uses. Among the antivaccination crowd, for example, it’s not uncommon to read about the superiority of something called “natural immunity,” brought about by exposure to the pathogen in question rather than to the deactivated (and therefore harmless) version of it made by humans in laboratories. “When you inject a vaccine into the body,” reads a post on an antivaxxer website, Campaign for Truth in Medicine, “you’re actually performing an unnatural act.” This, of course, is the very same term once used to decry homosexuality and, more recently, same-sex marriage, which the Family Research Council has taken to comparing unfavorably to what it calls “natural marriage.”

So what are we really talking about when we talk about natural? It depends; the adjective is impressively slippery, its use steeped in dubious assumptions that are easy to overlook. Perhaps the most incoherent of these is the notion that nature consists of everything in the world except us and all that we have done or made. In our heart of hearts, it seems, we are all creationists.

In the case of “natural immunity,” the modifier implies the absence of human intervention, allowing for a process to unfold as it would if we did nothing, as in “letting nature take its course.” In fact, most of medicine sets itself against nature’s course, which is precisely what we like about it — at least when it’s saving us from dying, an eventuality that is perhaps more natural than it is desirable.

Yet sometimes medicine’s interventions are unwelcome or go overboard, and nature’s way of doing things can serve as a useful corrective. This seems to be especially true at the beginning and end of life, where we’ve seen a backlash against humanity’s technological ingenuity that has given us both “natural childbirth” and, more recently, “natural death.”

This last phrase, which I expect will soon be on many doctors’ lips, indicates the enduring power of the adjective to improve just about anything you attach it to, from cereal bars all the way on up to dying. It seems that getting end-of-life patients and their families to endorse “do not resuscitate” orders has been challenging. To many ears, “D.N.R.” sounds a little too much like throwing Grandpa under the bus. But according to a paper in The Journal of Medical Ethics, when the orders are reworded to say “allow natural death,” patients and family members and even medical professionals are much more likely to give their consent to what amounts to exactly the same protocols.

The word means something a little different when applied to human behavior rather than biology (let alone snack foods). When marriage or certain sexual practices are described as “natural,” the word is being strategically deployed as a synonym for “normal” or “traditional,” neither of which carries nearly as much rhetorical weight. “Normal” is by now too obviously soaked in moral bigotry; by comparison, “natural” seems to float high above human squabbling, offering a kind of secular version of what used to be called divine law. Of course, that’s exactly the role that “natural law” played for America’s founding fathers, who invoked nature rather than God as the granter of rights and the arbiter of right and wrong.

Read the entire article here.

Image courtesy of Google Search.


The Rich and Powerful Live by Different Rules

Bradley_Manning

Never has there been such a wonderful example of blatant, utter hypocrisy. This time it comes from the United States Department of Justice. It would be refreshing to convey to our leaders that not only do “Black Lives Matter”; “Less Privileged Lives” matter as well.

David Petraeus — former director of the CIA, no less, and an ex-four-star general — copped a mere two years of probation and a $100,000 fine for leaking classified information to his biographer. Chelsea Manning, formerly Bradley Manning, an intelligence analyst and ex-army private, was sentenced to 35 years in prison in 2013 for disclosing classified documents to WikiLeaks.

And, there are many other similar examples.

DCIA David Petraeus

We wince when hearing of oligarchic corruption and favoritism in other nations, such as Russia and China. But in this country it goes by the euphemism known as “justice”, so it must be OK.

From arstechnica:

Yesterday [April 23, 2015], former CIA Director David Petraeus was handed two years of probation and a $100,000 fine after agreeing to a plea deal that ends in no jail time for leaking classified information to Paula Broadwell, his biographer and lover.

“I now look forward to moving on with the next phase of my life and continuing to serve our great nation as a private citizen,” Petraeus said outside the federal courthouse in Charlotte, North Carolina on Thursday.

Lower-level government leakers have not, however, been as likely to walk out of a courthouse applauding the US as Petraeus did. Trevor Timm, executive director of the Freedom of the Press Foundation, called the Petraeus plea deal a “gross hypocrisy.”

“At the same time as Petraeus got off virtually scot-free, the Justice Department has been bringing the hammer down upon other leakers who talk to journalists—sometimes for disclosing information much less sensitive than Petraeus did,” he said.

The Petraeus sentencing came days after the Justice Department demanded (PDF) up to a 24-year term for Jeffrey Sterling, a former CIA agent who leaked information to a Pulitzer Prize-winning writer about a botched mission to sell nuclear plans to Iran in order to hinder its nuclear-weapons progress.

“A substantial sentence in this case would send an appropriate and much needed message to all persons entrusted with the handling of classified information, i.e., that intentional breaches of the laws governing the safeguarding of national defense information will be pursued aggressively, and those who violate the law in this manner will be tried, convicted, and punished accordingly,” the Justice Department argued in Sterling’s case this week.

The Daily Beast sums up the argument that the Petraeus deal involves a double standard by noting other recent penalties for lower-level leakers:

“Chelsea Manning, formerly Bradley Manning, was sentenced to 35 years in prison in 2013 for disclosing classified documents to WikiLeaks. Stephen Jin-Woo Kim, a former State Department contractor, entered a guilty plea last year to one felony count of disclosing classified information to a Fox News reporter in February 2014. He was sentenced to 13 months in prison. On Monday, prosecutors urged a judge to sentence Jeffrey Sterling, a former CIA officer, to at least 20 years in prison for leaking classified plans to sabotage Iran’s nuclear-weapons program to a New York Times reporter. Sterling will be sentenced next month. And former CIA officer John C. Kiriakou served 30 months in federal prison after he disclosed the name of a covert operative to a reporter. He was released in February and is finishing up three months of house arrest.”

The information Petraeus was accused of leaking, according to the original indictment, contained “classified information regarding the identities of covert officers, war strategy, intelligence capabilities and mechanisms, diplomatic discussions, quotes and deliberative discussions from high-level National Security Council meetings.” The leak also included “discussions with the president of the United States.”

The judge presiding over the case, US Magistrate Judge David Keesler, increased the government’s recommended fine of $40,000 to $100,000 because of Petraeus’ “grave but uncharacteristic error in judgement.”

Read the entire story here.

Images: Four-Star General David Petraeus; Private Chelsea Manning. Courtesy of Wikipedia.

Belief and the Falling Light

[tube]dpmXyJrs7iU[/tube]

Many of us now accept that lights falling from the sky are rocky interlopers from within our solar system, rather than visiting angels or signs from an angry (or mysteriously benevolent) God. New analysis of the meteor that exploded over Chelyabinsk, Russia, in 2013 suggests that one of the key founders of Christianity may have witnessed a similar natural phenomenon around two thousand years ago. However, at the time, Saul (later to become Paul the evangelist) interpreted the dazzling light on the road to Damascus — as recounted in the Acts of the Apostles in the New Testament — as a message from a Christian God. The rest, as they say, is history. Luckily, scientific progress now means that most of us no longer establish new religious movements based on fireballs in the sky. But we are awed nonetheless.

From the New Scientist:

Nearly two thousand years ago, a man named Saul had an experience that changed his life, and possibly yours as well. According to Acts of the Apostles, the fifth book of the biblical New Testament, Saul was on the road to Damascus, Syria, when he saw a bright light in the sky, was blinded and heard the voice of Jesus. Changing his name to Paul, he became a major figure in the spread of Christianity.

William Hartmann, co-founder of the Planetary Science Institute in Tucson, Arizona, has a different explanation for what happened to Paul. He says the biblical descriptions of Paul’s experience closely match accounts of the fireball meteor seen above Chelyabinsk, Russia, in 2013.

Hartmann has detailed his argument in the journal Meteoritics & Planetary Science (doi.org/3vn). He analyses three accounts of Paul’s journey, thought to have taken place around AD 35. The first is a third-person description of the event, thought to be the work of one of Jesus’s disciples, Luke. The other two quote what Paul is said to have subsequently told others.

“Everything they are describing in those three accounts in the book of Acts are exactly the sequence you see with a fireball,” Hartmann says. “If that first-century document had been anything other than part of the Bible, that would have been a straightforward story.”

But the Bible is not just any ancient text. Paul’s Damascene conversion and subsequent missionary journeys around the Mediterranean helped build Christianity into the religion it is today. If his conversion was indeed as Hartmann explains it, then a random space rock has played a major role in determining the course of history (see “Christianity minus Paul”).

That’s not as strange as it sounds. A large asteroid impact helped kill off the dinosaurs, paving the way for mammals to dominate the Earth. So why couldn’t a meteor influence the evolution of our beliefs?

“It’s well recorded that extraterrestrial impacts have helped to shape the evolution of life on this planet,” says Bill Cooke, head of NASA’s Meteoroid Environment Office in Huntsville, Alabama. “If it was a Chelyabinsk fireball that was responsible for Paul’s conversion, then obviously that had a great impact on the growth of Christianity.”

Hartmann’s argument is possible now because of the quality of observations of the Chelyabinsk incident. The 2013 meteor is the most well-documented example of larger impacts that occur perhaps only once in 100 years. Before 2013, the 1908 blast in Tunguska, also in Russia, was the best example, but it left just a scattering of seismic data, millions of flattened trees and some eyewitness accounts. With Chelyabinsk, there is a clear scientific argument to be made, says Hartmann. “We have observational data that match what we see in this first-century account.”

Read the entire article here.

Video: Meteor above Chelyabinsk, Russia in 2013. Courtesy of Tuvix72.

Endless Political Campaigning

US-politicians

The great capitalist market has decided — endless political campaigning in the United States is beneficial. If you think the presidential campaign to elect the next leader in 2016 began sometime last year, you are not mistaken. In fact, political posturing for the next election often begins before the current one is even decided. We all complain: too many ads, too much negativity, far too much inanity and too little substance. Yet we allow the process to continue, and to grow in scale. Would you put up with a political campaign that lasts a mere 38 days? The British seem to do it. But, then again, the United States is so much more advanced, right?

From WSJ:

On March 23, Ted Cruz announced he is running for president in a packed auditorium at Liberty University in Lynchburg, Va. On April 7, Rand Paul announced he is running for president amid the riverboat décor of the Galt House hotel in Louisville, Ky. On April 12, Hillary Clinton announced she is running for president in a brief segment of a two-minute video. On April 13, Marco Rubio announced he is running before a cheering crowd at the Freedom Tower in Miami. And these are just the official announcements.

Jeb Bush made it known in December that he is interested in running. Scott Walker’s rousing speech at the Freedom Summit in Des Moines, Iowa, on Jan. 24 left no doubt that he will enter the race. Chris Christie’s appearance in New Hampshire last week strongly suggests the same. Previous presidential candidates Mike Huckabee, Rick Perry and Rick Santorum seem almost certain to run. Pediatric neurosurgeon Ben Carson is reportedly ready to announce his run on May 4 at the Detroit Music Hall.

With some 570 days left until Election Day 2016, the race for president is very much under way—to the dismay of a great many Americans. They find the news coverage of the candidates tiresome (what did Hillary order at Chipotle?), are depressed by the negative campaigning that is inevitable in an adversarial process, and dread the onslaught of political TV ads. Too much too soon!

They also note that other countries somehow manage to select their heads of government much more quickly. The U.K. has a general election campaign going on right now. It began on March 30, when the queen, on the advice of the prime minister, dissolved Parliament, and voting will take place on May 7. That’s 38 days later. Britons are complaining that the electioneering goes on too long.

American presidential campaigns did not always begin so soon, but they have for more than a generation now. As a young journalist, Sidney Blumenthal (in recent decades a consigliere to the Clintons) wrote quite a good book titled “The Permanent Campaign.” It was published in 1980. Mr. Blumenthal described what was then a relatively new phenomenon.

When Jimmy Carter announced his candidacy for president in January 1975, he was not taken particularly seriously. But his perseverance paid off, and he took the oath of office two years later. His successors—Ronald Reagan, George H.W. Bush and Bill Clinton—announced their runs in the fall before their election years, although they had all been busy assembling campaigns before that. George W. Bush announced in June 1999, after the adjournment of the Texas legislature. Barack Obama announced in February 2007, two days before Lincoln’s birthday, in Lincoln’s Springfield, Ill. By that standard, declared candidates Mr. Cruz, Mr. Paul, Mrs. Clinton and Mr. Rubio got a bit of a late start.

Why are American presidential campaigns so lengthy? And is there anything that can be done to compress them to a bearable timetable?

One clue to the answers: The presidential nominating process, the weakest part of our political system, is also the one part that was not envisioned by the Founding Fathers. The framers of the Constitution created a powerful presidency, confident (justifiably, as it turned out) that its first incumbent, George Washington, would set precedents that would guide the republic for years to come.

But they did not foresee that even in Washington’s presidency, Americans would develop political parties, which they abhorred. The Founders expected that later presidents would be chosen, usually by the House of Representatives, from local notables promoted by different states in the Electoral College. They did not expect that the Federalist and Republican parties would coalesce around two national leaders—Washington’s vice president, John Adams, and Washington’s first secretary of state, Thomas Jefferson—in the close elections of 1796 and 1800.

The issue then became: When a president followed George Washington’s precedent and retired after two terms, how would the parties choose nominees, in a republic that, from the start, was regionally, ethnically and religiously diverse?

Read the entire story here.

Image courtesy of Google Search.

Religious Dogma and DNA

Despite ongoing conflicts around the globe that are fueled or governed by religious fanaticism, it is entirely plausible that our general tendency toward supernatural belief is encoded in our DNA. Of course, this does not mean that a god or various gods exist; it merely implies that, over time, natural selection generally favored those who believed in deities over those who did not. We are such complex and contradictory animals.

From NYT:

Most of us find it mind-boggling that some people seem willing to ignore the facts — on climate change, on vaccines, on health care — if the facts conflict with their sense of what someone like them believes. “But those are the facts,” you want to say. “It seems weird to deny them.”

And yet a broad group of scholars is beginning to demonstrate that religious belief and factual belief are indeed different kinds of mental creatures. People process evidence differently when they think with a factual mind-set rather than with a religious mind-set. Even what they count as evidence is different. And they are motivated differently, based on what they conclude. On what grounds do scholars make such claims?

First of all, they have noticed that the very language people use changes when they talk about religious beings, and the changes mean that they think about their realness differently. You do not say, “I believe that my dog is alive.” The fact is so obvious it is not worth stating. You simply talk in ways that presume the dog’s aliveness — you say she’s adorable or hungry or in need of a walk. But to say, “I believe that Jesus Christ is alive” signals that you know that other people might not think so. It also asserts reverence and piety. We seem to regard religious beliefs and factual beliefs with what the philosopher Neil Van Leeuwen calls different “cognitive attitudes.”

Second, these scholars have remarked that when people consider the truth of a religious belief, what the belief does for their lives matters more than, well, the facts. We evaluate factual beliefs often with perceptual evidence. If I believe that the dog is in the study but I find her in the kitchen, I change my belief. We evaluate religious beliefs more with our sense of destiny, purpose and the way we think the world should be. One study found that over 70 percent of people who left a religious cult did so because of a conflict of values. They did not complain that the leader’s views were mistaken. They believed that he was a bad person.

Third, these scholars have found that religious and factual beliefs play different roles in interpreting the same events. Religious beliefs explain why, rather than how. People who understand readily that diseases are caused by natural processes might still attribute sickness at a particular time to demons, or healing to an act of God. The psychologist Cristine H. Legare and her colleagues recently demonstrated that people use both natural and supernatural explanations in this interdependent way across many cultures. They tell a story, as recounted in Tracy Kidder’s book on the anthropologist and physician Paul Farmer, about a woman who had taken her tuberculosis medication and been cured — and who then told Dr. Farmer that she was going to get back at the person who had used sorcery to make her ill. “But if you believe that,” he cried, “why did you take your medicines?” In response to the great doctor, she replied, in essence, “Honey, are you incapable of complexity?”

Moreover, people’s reliance on supernatural explanations increases as they age. It may be tempting to think that children are more likely than adults to reach out to magic to explain something, and that they increasingly put that mind-set to the side as they grow up, but the reverse is true. It’s the young kids who seem skeptical when researchers ask them about gods and ancestors, and the adults who seem clear and firm. It seems that supernatural ideas do things for adults they do not yet do for children.

Finally, scholars have determined that people don’t use rational, instrumental reasoning when they deal with religious beliefs. The anthropologist Scott Atran and his colleagues have shown that sacred values are immune to the normal cost-benefit trade-offs that govern other dimensions of our lives. Sacred values are insensitive to quantity (one cartoon can be a profound insult). They don’t respond to material incentives (if you offer people money to give up something that represents their sacred value, they often become more intractable in their refusal). Sacred values may even have different neural signatures in the brain.

The danger point seems to be when people feel themselves to be completely fused with a group defined by its sacred value. When Mr. Atran and his colleagues surveyed young men in two Moroccan neighborhoods associated with militant jihad (one of them home to five men who helped plot the 2004 Madrid train bombings, and then blew themselves up), they found that those who described themselves as closest to their friends and who upheld Shariah law were also more likely to say that they would suffer grievous harm to defend Shariah law. These people become what Mr. Atran calls “devoted actors” who are unconditionally committed to their sacred value, and they are willing to die for it.

Read the entire article here.

Dark Matter May Cause Cancer and Earthquakes

Abell 1689

Leave aside the fact that there is no direct evidence for the existence of dark matter; indeed, the theories that point indirectly to its existence seem rather questionable as well. That said, cosmologists are increasingly convinced that dark matter’s gravitational effects can be inferred from recent observations of gravitationally lensed galaxy clusters. Some researchers postulate that this eerily murky non-substance — it doesn’t interact with anything in our visible universe except, perhaps, through gravity — may be the cause of activity much closer to home. All very interesting.

From NYT:

Earlier this year, Dr. Sabine Hossenfelder, a theoretical physicist in Stockholm, made the jarring suggestion that dark matter might cause cancer. She was not talking about the “dark matter” of the genome (another term for junk DNA) but about the hypothetical, lightless particles that cosmologists believe pervade the universe and hold the galaxies together.

Though it has yet to be directly detected, dark matter is presumed to exist because we can see the effects of its gravity. As its invisible particles pass through our bodies, they could be mutating DNA, the theory goes, adding at an extremely low level to the overall rate of cancer.

It was unsettling to see two such seemingly different realms, cosmology and oncology, suddenly juxtaposed. But that was just the beginning. Shortly after Dr. Hossenfelder broached her idea in an online essay, Michael Rampino, a professor at New York University, added geology and paleontology to the picture.

Dark matter, he proposed in an article in the Monthly Notices of the Royal Astronomical Society, is responsible for the mass extinctions that have periodically swept Earth, including the one that killed the dinosaurs.

His idea is based on speculations by other scientists that the Milky Way is sliced horizontally through its center by a thin disk of dark matter. As the sun, traveling around the galaxy, bobs up and down through this darkling plane, it generates gravitational ripples strong enough to dislodge distant comets from their orbits, sending them hurtling toward Earth.

An earlier version of this hypothesis was put forth last year by the Harvard physicists Lisa Randall and Matthew Reece. But Dr. Rampino has added another twist: During Earth’s galactic voyage, dark matter accumulates in its core. There the particles self-destruct, generating enough heat to cause deadly volcanic eruptions. Struck from above and below, the dinosaurs succumbed.

It is surprising to see something as abstract as dark matter take on so much solidity, at least in the human mind. The idea was invented in the early 1930s as a theoretical contrivance — a means of explaining observations that otherwise didn’t make sense.

Galaxies appear to be rotating so fast that they should have spun apart long ago, throwing off stars like sparks from a Fourth of July pinwheel. There just isn’t enough gravity to hold a galaxy together, unless you assume that it hides a huge amount of unseen matter — particles that neither emit nor absorb light.
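For readers who want the arithmetic behind that claim, here is a minimal sketch of the standard rotation-curve argument (my gloss, not part of the article): a star on a circular orbit of radius $r$ around a galaxy with enclosed mass $M(<r)$ balances gravity against centripetal acceleration,

\[ \frac{v^2}{r} = \frac{G\,M(<r)}{r^2} \qquad\Longrightarrow\qquad v(r) = \sqrt{\frac{G\,M(<r)}{r}}. \]

If the luminous matter were all there is, $M(<r)$ would stop growing at the edge of the visible disk and orbital speeds would fall off as $1/\sqrt{r}$. Measured rotation curves instead stay roughly flat far beyond the starlight, which requires $M(<r)$ to keep growing in proportion to $r$ — the “missing mass” that dark matter is invoked to supply.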

Some mavericks propose alternatives, attempting to tweak the equations of gravity to account for what seems like missing mass. But for most cosmologists, the idea of unseeable matter has become so deeply ingrained that it is now almost impossible to do without.

Said to be five times more abundant than the stuff we can see, dark matter is a crucial component of the theory behind gravitational lensing, in which large masses like galaxies can bend light beams and cause stars to appear in unexpected parts of the sky.

That was the explanation for the spectacular observation of an “Einstein Cross” reported last month. Acting like an enormous lens, a cluster of galaxies deflected the light of a supernova into four images — a cosmological mirage. The light for each reflection followed a different path, providing glimpses of four different moments of the explosion.

But not even a galactic cluster exerts enough gravity to bend light so severely unless you postulate that most of its mass consists of hypothetical dark matter. In fact, astronomers are so sure that dark matter exists that they have embraced gravitational lensing as a tool to map its extent.
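To put a rough number on “so severely” (my illustration, not from the article): the characteristic bending scale of a lens of mass $M$ is its Einstein radius,

\[ \theta_E = \sqrt{\frac{4\,G\,M}{c^2}\,\frac{D_{ls}}{D_l\,D_s}}, \]

where $D_l$, $D_s$ and $D_{ls}$ are the distances to the lens, to the source, and between the two. Because $\theta_E$ grows only as $\sqrt{M}$, a cluster whose true mass is about five times its visible mass bends light into image separations more than twice as wide as its stars and gas alone could manage — and it is separations on that larger scale that astronomers actually observe around clusters.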

Dark matter, in other words, is used to explain gravitational lensing, and gravitational lensing is taken as more evidence for dark matter.

Some skeptics have wondered if this is a modern-day version of what ancient astronomers called “saving the phenomena.” With enough elaborations, a theory can account for what we see without necessarily describing reality. The classic example is the geocentric model of the heavens that Ptolemy laid out in the Almagest, with the planets orbiting Earth along paths of complex curlicues.

Ptolemy apparently didn’t care whether his filigrees were real. What was important to him was that his model worked, predicting planetary movements with great precision.

Modern scientists are not ready to settle for such subterfuge. To show that dark matter resides in the world and not just in their equations, they are trying to detect it directly.

Though its identity remains unknown, most theorists are betting that dark matter consists of WIMPs — weakly interacting massive particles. If they really exist, it might be possible to glimpse them when they interact with ordinary matter.

Read the entire article here.

Image: Abell 1689 galaxy cluster. Courtesy of NASA, ESA, and D. Coe (NASA JPL/Caltech and STScI).