Tag Archives: death

Hedging With Death


I’ve never met a hedge fund guy. I don’t think I ever will. They’re invariably male and white. Hedge fund guys move in very different circles than mere mortgage-bound mortals like me, usually orbited by billions of dollars and extravagant toys like 200 ft yachts, Tuscan palazzos and a Lamborghini on every continent. At least that’s the popular stereotype.

I’m not sure I like the idea of hedge funds and hedge fund guys with their complex and obfuscated financial transactions, nanosecond trading, risk-shifting strategies, corporate raids and restructurings. I’m not against gazillionaires per se — but I much prefer the billionaires who invent and make things over those who simply bet and gamble and destroy.

So, it comes as no surprise to learn that one predatory hedge fund guy has found a way to make money from the death of strangers. His name is Donald F. “Jay” Lathen Jr. and his hedge fund is known as Eden Arc Capital Management. Lathen found a neat way for his hedge fund to profit from bonds and CDs (certificates of deposit) with survivor options. For each of his “death transactions” there would be two joint account holders: himself (or an associate) and a terminally-ill patient at a nursing home or hospice. In exchange for naming Lathen as a joint holder and financial beneficiary, the patient would collect $10,000 from Lathen. Lathen would then rake in far greater sums from the redeemed bonds when the patient died.

Lathen’s trick was to enter into such deals only with patients that he calculated to be closest to death. Nothing illegal here, but certainly ethically head-scratching. Don’t you just love capitalism!
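To make the arithmetic concrete, here is a minimal Python sketch of the payoff, using invented numbers: the $10,000 participant fee comes from the SEC action quoted below, but the bond prices are purely illustrative.

```python
# Hedged sketch of the survivor-option ("death put") payoff.
# A bond trading below par can be redeemed at full face value
# as soon as one of its joint owners dies.

def death_put_profit(face_value: float, purchase_price: float,
                     participant_fee: float) -> float:
    """Profit if the terminally ill joint owner dies and the bond
    is redeemed at par. Ignores transaction costs and the risk that
    the issuer contests the redemption."""
    return face_value - purchase_price - participant_fee

# Illustrative numbers only: $1,000,000 face value bought at
# 80 cents on the dollar, less the $10,000 fee paid to the patient.
profit = death_put_profit(face_value=1_000_000,
                          purchase_price=800_000,
                          participant_fee=10_000)
print(f"Profit on redemption: ${profit:,.0f}")  # $190,000
```

The entire edge lies in the selection: the sooner the patient dies, the sooner the discount-to-par gap is realized.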

From Bloomberg:

A vital function of the financial system is to shift risk, but that is mostly a euphemism. Finance can’t make risks go away, or even really move them all that much. When the financial system shifts the risk of X happening from Y to Z, all that means is that Z gives Y money if X happens. If X was going to happen to Y, it’s still going to happen to Y. But now Y gets money.

Death is a central fact of human existence, the fundamental datum that gives meaning to life, but it is also a risk — you never know when it will happen! — and so the financial industry has figured out ways to shift it. Not in any supernatural sense, I mean, but in the regular financial-industry sense: by giving people money when death happens to them. One cannot know for certain how much of a consolation that is.

Another vital function of the financial system is to brutally punish the mispricing of risk through arbitrage. Actually I don’t really know how vital that one is, but people are pretty into it. If someone under- or overestimates a risk, someone else will find a way to make them pay for it. That’s how markets, even the market for death, stay efficient.

The normal way to shift the risk of death is life insurance — you die, the insurance company gives you money — but there are other, more esoteric versions, and they are more susceptible to arbitrage. One version involves “medium and long-term bonds and certificates of deposit (‘CDs’) that contain ‘survivor options’ or ‘death puts.'” Schematically, the idea is that a financial institution issues a bond that pays back $100 when it matures in 2040 or whatever. But if the buyer of the bond dies, he gets his $100 back immediately, instead of having to wait until 2040. He’s still dead, though.

But the bond can be owned jointly by two people, and when one of them dies, the other one gets the $100 back. If you and your friend buy a bond like that for $80, and then your friend dies, you make a quick $20.

But what are the odds of that? “Pretty low” was presumably the thinking of the companies issuing these bonds. But they didn’t reckon with Donald F. “Jay” Lathen Jr. and his hedge fund Eden Arc Capital Management:

Using contacts at nursing homes and hospices to identify patients that had a prognosis of less than six months left to live, and conducting due diligence into the patients’ medical condition, Lathen found Participants he could use to execute the Fund’s strategy. In return for agreeing to become a joint owner on an account with Lathen and/or another individual, the Participants were promised a fixed fee—typically, $10,000.

That is, needless to say, from the Securities and Exchange Commission administrative action against Lathen and Eden Arc. Lathen and a terminally ill patient would buy survivor-option bonds in a joint account, using Eden Arc’s money; the patient would die, Lathen would redeem the bonds, and Eden Arc would get the money. You are … somehow … not supposed to do this?

Read the entire story here.

Image: Antoine Wiertz’s painting of a man buried alive, 1854. Courtesy: Wiertz Museum, Brussels / Wikipedia. Public Domain.

What’s Up With Middle-Aged White Males?

Not too long ago I came across a number of articles describing the high and growing incidence of suicide among middle-aged white males. Indeed, the suicide rate has skyrocketed 40 percent since the early 2000s.

Understandably, and no less sadly, the increase in suicides seems to be driven by acute financial distress, chronic pain and/or illness, alcoholism and drug addiction.

Now, it seems that there is a corresponding increase in the number of white males faking their disappearance or fantasizing about it. A classic example is John Darwin from the UK, also known as “canoe man”, who faked his own death in 2002. But a key difference between this group and those who take their own lives is that the group of white males looking to disappear tends to be financially and (reasonably) emotionally stable.

So what on earth is going on?

A soon-to-be-published book, Playing Dead: A Journey Through the World of Death Fraud by Elizabeth Greenwood, examines what it’s like to fake your own death and the burgeoning “disappearance” industry.

Here’s an excerpt:

Perhaps Todd’s plan for faking his death will remain in the realm of pure fantasy. But were he to put his plan into motion, Todd fits the prime demographic for a death fraudster. As a middle-aged, middle-class, heterosexual white man with a family, Todd represents the person most likely to fake his death. I’d noticed this disproportion in the demographics, and I wondered if there was anything to it. Privacy consultant Frank Ahearn and author of How to Disappear told me that the majority of his clients who sought to leave their lives behind were men, and J. J. Luna, author of How to Be Invisible: Protect Your Home, Your Children, Your Assets, and Your Life, told me that “far more men than women!” seek his “invisibility” services. In the 1996 guidebook How to Disappear Completely and Never Be Found, disappearance enthusiast Doug Richmond writes, “To a man of a certain age, there’s a bit of magic in the very thought of cutting all ties, of getting away from it all, of changing names and jobs and women and living happily ever after in a more salubrious clime!”

But why do these seemingly privileged men, who enjoy every perk that DNA has to offer, feel so hemmed in that they must go off the radar entirely? Perhaps it’s because although men still out-earn women, they then entangle themselves in financial trouble trying to enhance their fortunes. Maybe they shrug off family obligations because they feel less responsibility to see their children grow and flourish. Women shoulder the burdens of family and community—they take care of dying parents, snotty kids, shut-in neighbors—anyone before themselves. Though that might be relying too heavily on conventional wisdom about gender roles, the numbers speak for themselves: faking death seems to be a heavily male phenomenon. After combing through the stories and examining the traits that men like Todd share, I noticed that they all seemed to feel emasculated, made impotent, by their mundane lives. So, not earning enough money, they invest in a harebrained scheme. Underwhelmed with their monogamous sex lives, they take up with other women. Faking death seems to be not only a way out but also, counterintuitively, a way to be brave.

Read more here.

Image: Actor Leonard Rossiter plays Reginald Iolanthe Perrin, from The Fall and Rise of Reginald Perrin, a mid-1970s BBC sitcom. Courtesy: BBC.

As Clear As Black and White


The terrible tragedy that is wrought by guns in the United States continues unabated. And it’s even more tragic when elements of our police forces fuel the unending violence, more often than not enabled by racism. Minnesota Governor Mark Dayton put it quite starkly yesterday, following the fatal shooting on July 6, 2016, of Philando Castile, a Falcon Heights resident who was pulled over for a broken tail-light.

Just one day earlier, police officers in Baton Rouge, Louisiana shot and killed Alton Sterling.


And today we hear that the cycle of mistrust, hatred and deadly violence — courtesy of guns — has come full circle: a racist sniper (or snipers) apparently targeted and murdered five white police officers in Dallas, Texas on July 7, 2016.

Images: Screenshots courtesy of Washington Post and WSJ, respectively.

Curate Your Own Death


It’s your funeral. So why not manage it yourself?

A new crop of smartphone and web apps aims to deliver end-of-life planning services directly to your small screen. Not only can you manage your own funeral, some of these services even help you curate your own afterlife. Apparently, apps like Cake, SafeBeyond, Everplans and Everest are perfectly suited to millennials, many of whom already curate significant aspects of their lives online.

From the Guardian:

A young man is staring straight into the camera. He looks late 20s or early 30s, with a suede blazer and two-toned hipster glasses, and cheerfully waves as he introduces himself. “Hi, my name’s Will,” he tells the YouTube audience. “And I’m dead.”

“While my family is a bit upset, they’re not stressed. Because when I was among the land of the living, I made the incredibly smart move of signing up for Everest.”

Will flashes a smile. His family plans his funeral in the background, using the detailed plan he left behind.

Everest is a Houston-based funeral concierge, and the firm that commissioned Will’s upbeat, millennial-friendly video last fall from Sandwich Video, a Los Angeles production company popular with the tech set in Silicon Valley. Everest published the film in February 2016 as part of a campaign to target millennials, hoping even twentysomethings can be lured into thinking about their digital afterlives.

Everest is just one of a wave of apps and digital services that are emerging to help millennials plan their own #authentic mortal passings, right down to Instagram-worthy funerals. Last fall, rival apps Cake and SafeBeyond were released within one month of each other, and both hope to streamline end-of-life planning into one simple app.

Death apps promise to help a person organize his or her entire online life into a bundle of digital living wills, funeral plans, multimedia memorial portfolios and digital estate arrangements. It could be the mother of all personal media accounts, designed to store all of a person’s online passwords in one spot, for a successor to retrieve after he or she dies.

But millennials already curate their digital lives to perfection on social media. So are these “death apps” just adding another layer of pressure to personalize yet another stage of their lives?

Read the entire story here.

Image: Six Feet Under, opening title. Courtesy: HBO / Wikia.

The Increasing Mortality of White Males

This is the type of story that you might not normally, and certainly should not, associate with the world’s richest country. In a reversal of a long-established trend, death rates are increasing for less-educated white males. The good news is that death rates continue to fall for other demographic and racial groups, especially Hispanics and African Americans. So, what is happening to white males?

From the NYT:

It’s disturbing and puzzling news: Death rates are rising for white, less-educated Americans. The economists Anne Case and Angus Deaton reported in December that rates have been climbing since 1999 for non-Hispanic whites age 45 to 54, with the largest increase occurring among the least educated. An analysis of death certificates by The New York Times found similar trends and showed that the rise may extend to white women.

Both studies attributed the higher death rates to increases in poisonings and chronic liver disease, which mainly reflect drug overdoses and alcohol abuse, and to suicides. In contrast, death rates fell overall for blacks and Hispanics.

Why are whites overdosing or drinking themselves to death at higher rates than African-Americans and Hispanics in similar circumstances? Some observers have suggested that higher rates of chronic opioid prescriptions could be involved, along with whites’ greater pessimism about their finances.

Yet I’d like to propose a different answer: what social scientists call reference group theory. The term “reference group” was pioneered by the social psychologist Herbert H. Hyman in 1942, and the theory was developed by the Columbia sociologist Robert K. Merton in the 1950s. It tells us that to comprehend how people think and behave, it’s important to understand the standards to which they compare themselves.

How is your life going? For most of us, the answer to that question means comparing our lives to the lives our parents were able to lead. As children and adolescents, we closely observed our parents. They were our first reference group.

And here is one solution to the death-rate conundrum: It’s likely that many non-college-educated whites are comparing themselves to a generation that had more opportunities than they have, whereas many blacks and Hispanics are comparing themselves to a generation that had fewer opportunities.

Read the entire article here.

Words Before Death

A team of psychologists recently compiled and assessed the last words of prison inmates who were facing execution in Texas.

I was surprised to learn of a publicly accessible “last statement” database, available via the Texas Department of Criminal Justice.

Whether or not you believe the death penalty is just (I do not), you will surely find these final utterances moving — time for some reflection.

From the Independent:

Psychologists have analysed the last words of inmates who were condemned to death in Texas.

In a new paper, published in Frontiers in Psychology, researchers Dr. Sarah Hirschmüller and Dr. Boris Egloff used a database of last statements of inmates on death row and found the majority of the statements to be positive.

The researchers theorise that the inmates, the average age of whom in the current dataset is just over 39, expressed positive sentiments, because their minds were working in overdrive to avert them from fearing their current situation.

This is called ‘Terror-Management Theory’ (TMT). The concept is that people search for meaning when confronted with terror in a bid to maintain self-esteem and that “individuals employ a wide range of cognitive and behavioural efforts to regulate the anxiety that mortality salience evokes.”

Read more here.

Image: Execution room in the San Quentin State Prison in California. Public Domain.

RIP: Maurice White


We’ve lost another great musical innovator. I’m sick and tired of my artistic heroes dying. But, at the very least, I still have the sounds and the visions.

More on the sad passing of Maurice White from Rolling Stone, NYT, USA Today, BBC News, and CNN.

Image: Maurice White performing with Earth, Wind & Fire at Ahoy Rotterdam, 1982. Courtesy: Chris Hakkens – http://www.flickr.com/photos/chris_hakkens/4638840128/in/photostream/

A Googol Years From Now

If humanity makes it through the next few years and decades without destroying itself and the planet, we can ponder the broader fate of our universal home. Assuming humanity escapes the death of our beautiful local star (in 4-5 billion years or so) and the merging of our very own Milky Way and the Andromeda galaxy (around 7-10 billion years), we’ll be toast in a googol years. Actually, we and everything else in the cosmos will be more like a cold, dark particle soup. By the way, a googol is a rather large number — 10^100. That gives us plenty of time to fix ourselves.

From Space:

Yes, the universe is dying. Get over it.

 Well, let’s back up. The universe, as defined as “everything there is, in total summation,” isn’t going anywhere anytime soon. Or ever. If the universe changes into something else far into the future, well then, that’s just more universe, isn’t it?

But all the stuff in the universe? That’s a different story. When we’re talking all that stuff, then yes, everything in the universe is dying, one miserable day at a time.

You may not realize it by looking at the night sky, but the ultimate darkness is already settling in. Stars first appeared on the cosmic stage rather early — more than 13 billion years ago; just a few hundred million years into this Great Play. But there’s only so much stuff in the universe, and only so many opportunities to make balls of it dense enough to ignite nuclear fusion, creating the stars that fight against the relentless night.

The expansion of the universe dilutes everything in it, meaning there are fewer and fewer chances to make the nuclear magic happen. And around 10 billion years ago, the expansion reached a tipping point. The matter in the cosmos was spread too thin. The engines of creation shut off. The curtain was called: the epoch of peak star formation has already passed, and we are currently living in the wind-down stage. Stars are still born all the time, but the birth rate is dropping.

At the same time, that dastardly dark energy is causing the expansion of the universe to accelerate, ripping galaxies away from each other faster than the speed of light (go ahead, say that this violates some law of physics, I dare you), drawing them out of the range of any possible contact — and eventually, visibility — with their neighbors. With the exception of the Andromeda Galaxy and a few pathetic hangers-on, no other galaxies will be visible. We’ll become very lonely in our observable patch of the universe.

The infant universe was a creature of heat and light, but the cosmos of the ancient future will be a dim, cold animal.

The only consolation is the time scale involved. You thought 14 billion years was a long time? The numbers I’m going to present are ridiculous, even with exponential notation. You can’t wrap your head around it. They’re just … big.

For starters, we have at least 2 trillion years until the last sun is born, but the smallest stars will continue to burn slow and steady for another 100 trillion years in a cosmic Children of Men. Our own sun will be long gone by then, heaving off its atmosphere within the next 5 billion years and charcoaling the Earth. Around the same time, the Milky Way and Andromeda galaxies will collide, making a sorry mess of the local system.

At the end of this 100-trillion-year “stelliferous” era, the universe will only be left with the … well, leftovers: white dwarves (some cooled to black dwarves), neutron stars and black holes. Lots of black holes.

Welcome to the Degenerate Era, a state that is as sad as it sounds. But even that isn’t the end game. Oh no, it gets worse. After countless gravitational interactions, planets will get ejected from their decaying systems and galaxies themselves will dissolve. Losing cohesion, our local patch of the universe will be a disheveled wreck of a place, with dim, dead stars scattered about randomly and black holes haunting the depths.

The early universe was a very strange place, and the late universe will be equally bizarre. Given enough time, things that seem impossible become commonplace, and objects that appear immutable … uh, mutate. Through a process called quantum tunneling, any solid object will slowly “leak” atoms, dissolving. Because of this, gone will be the white dwarves, the planets, the asteroids, the solid.

Even fundamental particles are not immune: given 10^34 years, the neutrons in neutron stars will break apart into their constituent particles. We don’t yet know if the proton is stable, but if it isn’t, it’s only got 10^40 years before it meets its end.

With enough time (and trust me, we’ve got plenty of time), the universe will consist of nothing but light particles (electrons, neutrinos and their ilk), photons and black holes. The black holes themselves will probably dissolve via Hawking Radiation, briefly illuminating the impenetrable darkness as they decay.

After 10^100 years (but who’s keeping track at this point?), nothing macroscopic remains. Just a weak soup of particles and photons, spread so thin that they hardly ever interact.

Read the entire article here.

In case you’ve forgotten, a googol is 10^100 (10 to the power of 100), or 10 followed by 100 zeros. And, yes, that’s how the company Google derived its name.
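For a rough sense of scale, here’s a throwaway Python snippet that lines up the timescales quoted in the article against the current age of the universe (all figures are the article’s, in years):

```python
# Cosmic timescales quoted in the article, in years.
UNIVERSE_AGE = 1.4e10  # roughly 14 billion years

timescales = {
    "last star is born": 2e12,
    "smallest stars burn out": 1e14,
    "neutrons decay": 1e34,
    "protons decay (if unstable)": 1e40,
    "nothing macroscopic left (a googol)": 1e100,
}

for label, years in timescales.items():
    ratio = years / UNIVERSE_AGE  # multiples of the universe's current age
    print(f"{label:>36}: {years:.0e} years ({ratio:.0e} universe-ages)")
```

Even in exponential notation, the jump from the stelliferous era (10^14 years) to the googol era (10^100) is not a difference of degree but of kind: the exponent itself grows roughly sevenfold.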

Time for the Bucket List to Kick the Bucket

For the same reasons that New Year’s resolutions are daft, it’s time to ditch the bucket list. Columnist Steven Thrasher rightly argues that your actions to get something done or try something new should be driven by your gusto for life — passion, curiosity, wonder, joy — rather than dictated by a check box because you’re one step closer to death. Signs that it’s time to ditch the bucket list: when the idea is co-opted by corporations, advertisers and Hollywood; when motivational posters appear in hallways; and when physical bucket list buckets and notepads go on sale at Pottery Barn or Walmart.

From the Guardian:

Before each one of us dies, let’s wipe the “bucket list” from our collective vocabulary.

I hate the term “the bucket list.” The phrase, a list of things one wants to do in life before one dies or “kicks the bucket”, is the kind of hackneyed, cliche, stupid and insipid term only we Americans can come up with.

Even worse, “the bucket list” has become an excuse for people to couch things they actually desire to try doing as only socially acceptable if framed in the face of their death. It’s as if pleasure, curiosity and fun weren’t reasons enough for action.

If you want to try doing something others might find strange or unorthodox – write a novel, learn to tap dance, engage in a rim job, field dress a deer, climb Everest, go out in drag for a night – why do you need any justification at all? And certainly, why would you need an explanation that is only justifiable in terms of kicking the bucket?

According to the Wall Street Journal, the phrase “bucket list” comes to us from the banal mind of screenwriter Justin Zackham, who developed a list of things he wanted to do before he died. Years later, his “bucket list” became the title of his corny 2007 film starring Jack Nicholson and Morgan Freeman. It’s about two old men with terminal cancer who want to live it up before they die. That, if anyone at all, is who should be using the term “bucket list”. They want to do something with the finite time they know they have left? Fine.

But bucket list has trickled down to everyday use by the perfectly healthy, the exceptionally young, and most of all, to douche bags. I realized this at Burning Man last week. Often, when I asked exceptionally boring people what had drawn them to Black Rock City, they’d say: “It was on my bucket list!”

Really? You wanted to schlep out to the desert and face freezing lows, scorching highs and soul crushing techno simply because you’re going to die someday?

There’s a funny dynamic sometimes when I go on a long trip while I’m out of work. When I backpacked through Asia and Europe in 2013, people (usually friends chained to a spouse, children and a mortgage) would sometimes awkwardly say to me: “Well, it will be the trip of a lifetime!” It was a good trip, but just one of many great journeys I’ve taken in my life so far. My adventures might interrupt someone else’s idea of what’s “normal.” But travel isn’t something I do to fulfil my “bucket list”; travel is a way of life for me. I do not rush into a trip thinking: “Good Christ, I could die tomorrow!” I don’t travel in place of the stable job or partner or kids I may or may not ever have. I do it as often as I can because it brings me joy.

Read the entire column here.

Kodokushi. A Lonely Death

As we age many of us tend to ponder our legacies. We wonder if we did good throughout our lives; we wonder if we’ll be remembered. Then we die.

Some will pass on treasured mementos to their descendants, families and friends; others — usually the one percenters — will cast their names on buildings, art bequests, research funds, and academic chairs. And yet others may not entrust any physical objects to their survivors, but nonetheless they’ll leave behind even more significant artifacts: trails of goodwill, moral frameworks, positive behaviors and traits, sound knowledge and teachings, passion, wonder.

Some of us will die in our sleep. A few will die in accidents or at the hands of others. Many of us will die in hospitals or clinics, attached to our technologies, sometimes attended by nearest and dearest, sometimes attended only by clinicians.

Sadly, some will die alone. Paradoxically, despite our ever-increasing technological interconnectedness, this phenomenon is on the rise, especially in aging societies with low birth rates. Japan is a striking example — to such an extent that the Japanese even have a word for it: kodokushi, or “lonely death”. Sadder still, where there are kodokushi victims there are now removal companies dedicated to their cleanup.

From Roads and Kingdoms:

Three months ago in an apartment on the outskirts of Osaka, Japan, Haruki Watanabe died alone. For weeks his body slowly decomposed, slouched in its own fluids and surrounded by fetid, fortnight-old food. He died of self-neglect, solitude, and a suspected heart problem. At 60, Watanabe wasn’t old, nor was he especially poor. He had no friends, no job, no wife, and no concerned children. His son hadn’t spoken to him in years, nor did he want to again.

For three months no one called, no one knew, no one cared. For three months Watanabe rotted in his bedsheets, alongside pots of instant ramen and swarming cockroaches. The day that someone eventually called, he came not out of concern but out of administration. Watanabe had run out of money, and his bank had stopped paying the rent. The exasperated landlord, Toru Suzuki, had rung and rung, but no one had picked up. Sufficiently angry, he made the trip from his own home, in downtown Osaka, to the quiet suburb where his lodger lived. (Both men’s names are pseudonyms.)

First, there was the smell, a thick, noxious sweetness oozing from beneath the door frame. Second, there was the sight, the shape of a mortally slumped corpse beneath urine-soaked bedsheets. Third, there was the reality: Suzuki had come to collect his dues but had instead found his tenant’s dead body.

Disgusted, angry, but mostly shocked that this could happen to him, the landlord rang the police. The police came; they investigated with procedural dispassion and declared the death unsuspicious. This wasn’t suicide in the traditional sense, they said, but it did seem that the deceased had wanted to die. They’d seen it before, and it was an increasingly common occurrence throughout Japan: a single man dying, essentially, from loneliness.

They noted down what was required by their forms, wrapped up the body in officialdom, tied it with red tape, and removed it amid gawps and gags of inquisitive neighbors. The police then departed for the cemetery, where, because no family member had stepped forward to claim the body, they would inter Watanabe in an unmarked grave alongside the rest of Japan’s forgotten dead.

Suzuki was now left to his festering property and precarious financials. He was concerned. He didn’t know who to call or how to deal with the situation. In Japan, suicide can dramatically reduce the value of a property, and although this wasn’t suicide, his neighbors had seen enough; the gossip would spread fast. He heard whispers of kodokushi, a word bandied about since the Great Hanshin earthquake in 1995, when thousands of elderly Japanese were relocated to different residences and started dying alone, ostracized or isolated from family and friends. But what did that really mean for Suzuki, and how was he going to deal with it? Like most Japanese, he had heard of the “lonely death” but had not really believed in it; he certainly didn’t know what to do in such circumstances. So he turned to the Internet, and after hours of fruitless searching found a company called Risk-Benefit, run by a man named Toru Koremura.

With no other options he picked up the phone and gave the company a call.

With one of the fastest aging populations in the world and traditional family structures breaking down, Japan’s kodokushi phenomenon is becoming harder to ignore—not that the government and the Japanese people don’t do their best to sweep it under the carpet. Inaccurate statistics abound, with confusing definitions of what is and isn’t considered kodokushi being created in the process. According to the Ministry of Health, Labour and Welfare, there were some 3,700 “unaccompanied deaths” in Japan in 2013. However, other experts estimate the number is nearer 30,000 a year.

Scott North, a sociologist at Osaka University, argues that this extreme divergence could be the result of experts including some forms of suicide (of which there are around 27,000 cases a year in Japan) into the category of kodokushi. It could also be the result of bad accounting. Recently, senior Japanese bureaucrats admitted to having lost track of more than 250,000 people older than age 100. In a case that made international headlines in 2010, Sogen Kato, thought to be Tokyo’s oldest man at 111 years of age, turned out to have been mummified in his own apartment for more than 30 years.

Read the entire story here.

Death Explained


Let’s leave the mysteries of the spiritual afterlife aside for our various religions to fight over, and concentrate on what really happens after death. It may not please many aesthetes, but the cyclic process is beautiful nonetheless.

From Raw Story:

“It might take a little bit of force to break this up,” says mortician Holly Williams, lifting John’s arm and gently bending it at the fingers, elbow and wrist. “Usually, the fresher a body is, the easier it is for me to work on.”

Williams speaks softly and has a happy-go-lucky demeanour that belies the nature of her work. Raised and now employed at a family-run funeral home in north Texas, she has seen and handled dead bodies on an almost daily basis since childhood. Now 28 years old, she estimates that she has worked on something like 1,000 bodies.

Her work involves collecting recently deceased bodies from the Dallas–Fort Worth area and preparing them for their funeral.

“Most of the people we pick up die in nursing homes,” says Williams, “but sometimes we get people who died of gunshot wounds or in a car wreck. We might get a call to pick up someone who died alone and wasn’t found for days or weeks, and they’ll already be decomposing, which makes my work much harder.”

John had been dead about four hours before his body was brought into the funeral home. He had been relatively healthy for most of his life. He had worked his whole life on the Texas oil fields, a job that kept him physically active and in pretty good shape. He had stopped smoking decades earlier and drank alcohol moderately. Then, one cold January morning, he suffered a massive heart attack at home (apparently triggered by other, unknown, complications), fell to the floor, and died almost immediately. He was just 57 years old.

Now, John lay on Williams’ metal table, his body wrapped in a white linen sheet, cold and stiff to the touch, his skin purplish-grey – telltale signs that the early stages of decomposition were well under way.

Self-digestion

Far from being ‘dead’, a rotting corpse is teeming with life. A growing number of scientists view a rotting corpse as the cornerstone of a vast and complex ecosystem, which emerges soon after death and flourishes and evolves as decomposition proceeds.

Decomposition begins several minutes after death with a process called autolysis, or self-digestion. Soon after the heart stops beating, cells become deprived of oxygen, and their acidity increases as the toxic by-products of chemical reactions begin to accumulate inside them. Enzymes start to digest cell membranes and then leak out as the cells break down. This usually begins in the liver, which is rich in enzymes, and in the brain, which has a high water content. Eventually, though, all other tissues and organs begin to break down in this way. Damaged blood cells begin to spill out of broken vessels and, aided by gravity, settle in the capillaries and small veins, discolouring the skin.

Body temperature also begins to drop, until it has acclimatised to its surroundings. Then, rigor mortis – “the stiffness of death” – sets in, starting in the eyelids, jaw and neck muscles, before working its way into the trunk and then the limbs. In life, muscle cells contract and relax due to the actions of two filamentous proteins (actin and myosin), which slide along each other. After death, the cells are depleted of their energy source and the protein filaments become locked in place. This causes the muscles to become rigid and locks the joints.

During these early stages, the cadaveric ecosystem consists mostly of the bacteria that live in and on the living human body. Our bodies host huge numbers of bacteria; every one of the body’s surfaces and corners provides a habitat for a specialised microbial community. By far the largest of these communities resides in the gut, which is home to trillions of bacteria of hundreds or perhaps thousands of different species.

The gut microbiome is one of the hottest research topics in biology; it’s been linked to roles in human health and a plethora of conditions and diseases, from autism and depression to irritable bowel syndrome and obesity. But we still know little about these microbial passengers. We know even less about what happens to them when we die.

Putrefaction

Scattered among the pine trees in Huntsville, Texas, lie around half a dozen human cadavers in various stages of decay. The two most recently placed bodies are spread-eagled near the centre of the small enclosure with much of their loose, grey-blue mottled skin still intact, their ribcages and pelvic bones visible between slowly putrefying flesh. A few metres away lies another, fully skeletonised, with its black, hardened skin clinging to the bones, as if it were wearing a shiny latex suit and skullcap. Further still, beyond other skeletal remains scattered by vultures, lies a third body within a wood and wire cage. It is nearing the end of the death cycle, partly mummified. Several large, brown mushrooms grow from where an abdomen once was.

For most of us the sight of a rotting corpse is at best unsettling and at worst repulsive and frightening, the stuff of nightmares. But this is everyday for the folks at the Southeast Texas Applied Forensic Science Facility. Opened in 2009, the facility is located within a 247-acre area of National Forest owned by Sam Houston State University (SHSU). Within it, a nine-acre plot of densely wooded land has been sealed off from the wider area and further subdivided, by 10-foot-high green wire fences topped with barbed wire.

In late 2011, SHSU researchers Sibyl Bucheli and Aaron Lynne and their colleagues placed two fresh cadavers here, and left them to decay under natural conditions.

Once self-digestion is under way and bacteria have started to escape from the gastrointestinal tract, putrefaction begins. This is molecular death – the breakdown of soft tissues even further, into gases, liquids and salts. It is already under way at the earlier stages of decomposition but really gets going when anaerobic bacteria get in on the act.

Putrefaction is associated with a marked shift from aerobic bacterial species, which require oxygen to grow, to anaerobic ones, which do not. These then feed on the body’s tissues, fermenting the sugars in them to produce gaseous by-products such as methane, hydrogen sulphide and ammonia, which accumulate within the body, inflating (or ‘bloating’) the abdomen and sometimes other body parts.

This causes further discolouration of the body. As damaged blood cells continue to leak from disintegrating vessels, anaerobic bacteria convert haemoglobin molecules, which once carried oxygen around the body, into sulfhaemoglobin. The presence of this molecule in settled blood gives skin the marbled, greenish-black appearance characteristic of a body undergoing active decomposition.

Colonisation

When a decomposing body starts to purge, it becomes fully exposed to its surroundings. At this stage, the cadaveric ecosystem really comes into its own: a ‘hub’ for microbes, insects and scavengers.

Two species closely linked with decomposition are blowflies and flesh flies (and their larvae). Cadavers give off a foul, sickly-sweet odour, made up of a complex cocktail of volatile compounds that changes as decomposition progresses. Blowflies detect the smell using specialised receptors on their antennae, then land on the cadaver and lay their eggs in orifices and open wounds.

Each fly deposits around 250 eggs that hatch within 24 hours, giving rise to small first-stage maggots. These feed on the rotting flesh and then moult into larger maggots, which feed for several hours before moulting again. After feeding some more, these yet larger, and now fattened, maggots wriggle away from the body. They then pupate and transform into adult flies, and the cycle repeats until there’s nothing left for them to feed on.

Under the right conditions, an actively decaying body will have large numbers of stage-three maggots feeding on it. This ‘maggot mass’ generates a lot of heat, raising the inside temperature by more than 10°C. Like penguins huddling in the South Pole, individual maggots within the mass are constantly on the move. But whereas penguins huddle to keep warm, maggots in the mass move around to stay cool.

“It’s a double-edged sword,” Bucheli explains, surrounded by large toy insects and a collection of Monster High dolls in her SHSU office. “If you’re always at the edge, you might get eaten by a bird, and if you’re always in the centre, you might get cooked. So they’re constantly moving from the centre to the edges and back.”

Purging

“We’re looking at the purging fluid that comes out of decomposing bodies,” says Daniel Wescott, director of the Forensic Anthropology Center at Texas State University in San Marcos.

Wescott, an anthropologist specialising in skull structure, is using a micro-CT scanner to analyse the microscopic structure of the bones brought back from the body farm. He also collaborates with entomologists and microbiologists – including Javan, who has been busy analysing samples of cadaver soil collected from the San Marcos facility – as well as computer engineers and a pilot, who operate a drone that takes aerial photographs of the facility.

“I was reading an article about drones flying over crop fields, looking at which ones would be best to plant in,” he says. “They were looking at near-infrared, and organically rich soils were a darker colour than the others. I thought if they can do that, then maybe we can pick up these little circles.”

Those “little circles” are cadaver decomposition islands. A decomposing body significantly alters the chemistry of the soil beneath it, causing changes that may persist for years. Purging – the seeping of broken-down materials out of what’s left of the body – releases nutrients into the underlying soil, and maggot migration transfers much of the energy in a body to the wider environment. Eventually, the whole process creates a ‘cadaver decomposition island’, a highly concentrated area of organically rich soil. As well as releasing nutrients into the wider ecosystem, this attracts other organic materials, such as dead insects and faecal matter from larger animals.

According to one estimate, an average human body consists of 50–75 per cent water, and every kilogram of dry body mass eventually releases 32 g of nitrogen, 10 g of phosphorus, 4 g of potassium and 1 g of magnesium into the soil. Initially, it kills off some of the underlying and surrounding vegetation, possibly because of nitrogen toxicity or because of antibiotics found in the body, which are secreted by insect larvae as they feed on the flesh. Ultimately, though, decomposition is beneficial for the surrounding ecosystem.

According to the laws of thermodynamics, energy cannot be created or destroyed, only converted from one form to another. In other words: things fall apart, converting their mass to energy while doing so. Decomposition is one final, morbid reminder that all matter in the universe must follow these fundamental laws. It breaks us down, equilibrating our bodily matter with its surroundings, and recycling it so that other living things can put it to use.

Ashes to ashes, dust to dust.
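Those nutrient figures invite a quick back-of-the-envelope calculation. A minimal Python sketch, assuming a 70 kg body and a 60 per cent water fraction (roughly the midpoint of the quoted range); both numbers are illustrative:

```python
# Back-of-the-envelope nutrient release from a decomposing body,
# using the per-kilogram-of-dry-mass figures quoted above.
# The 70 kg body and 60% water fraction are invented assumptions.

body_mass_kg = 70.0
water_fraction = 0.60  # quoted range is 50-75 per cent
dry_mass_kg = body_mass_kg * (1 - water_fraction)  # 28 kg

grams_per_kg_dry = {"nitrogen": 32, "phosphorus": 10,
                    "potassium": 4, "magnesium": 1}

for nutrient, grams in grams_per_kg_dry.items():
    print(f"{nutrient:>10}: {dry_mass_kg * grams:,.0f} g released")
# nitrogen: 896 g, phosphorus: 280 g, potassium: 112 g, magnesium: 28 g
```

Nearly a kilogram of nitrogen from a single body: no wonder the “cadaver decomposition island” first scorches and then fertilizes the ground beneath it.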

Read the entire article here.

Image: Still-Life with a Skull, 17th-century painting by Philippe de Champaigne. Public Domain.

MondayMap: The State of Death


It’s a Monday, so why not dwell on an appropriately morbid topic — death. Or, to be more precise, a really cool map that shows the most distinctive causes of death for each state. We know that across the United States in general the most common causes of death are heart disease and cancer. However, looking a little deeper shows other, secondary causes that vary by state. So, leaving aside the top two, you will see that a resident of Tennessee is more likely to die from “accidental discharge of firearms”, while someone from Alabama will succumb to syphilis. Interestingly, Texans are more likely to depart this mortal coil from tuberculosis; Georgians from “abnormal clinical problems not elsewhere classified”. And Alaskans — no surprise here — lead the way in deaths from airplane, boating and “unspecified transport accidents”.

Read more here.

Map: Distinctive cause of death by state. Courtesy of Francis Boscoe, New York State Cancer Registry.


The Damned Embuggerance


Sadly, genre-busting author Sir Terry Pratchett succumbed to DEATH on March 12, 2015. Luckily, for those of us still fending off the clutches of Reaper Man we have seventy-plus works of his to keep us company in the darkness.

So now that our world contains a little less magic it’s important to remind ourselves of a few choice words of his:

A man is not truly dead while his name is still spoken.

Stories of imagination tend to upset those without one.

It’s not worth doing something unless someone, somewhere, would much rather you weren’t doing it.

The truth may be out there, but the lies are inside your head.

Goodness is about what you do. Not who you pray to.

From the Guardian:

Neil Gaiman led tributes from the literary, entertainment and fantasy worlds to Terry Pratchett after the author’s death on Thursday, aged 66.

The author of the Discworld novels, which sold in the tens of millions worldwide, had been afflicted with a rare form of early-onset Alzheimer’s disease.

Gaiman, who collaborated with Pratchett on the huge hit Good Omens, tweeted: “I will miss you, Terry, so much,” pointing to “the last thing I wrote about you”, on the Guardian.

“Terry Pratchett is not a jolly old elf at all,” wrote Gaiman last September. “Not even close. He’s so much more than that. As Terry walks into the darkness much too soon, I find myself raging too: at the injustice that deprives us of – what? Another 20 or 30 books? Another shelf-full of ideas and glorious phrases and old friends and new, of stories in which people do what they really do best, which is use their heads to get themselves out of the trouble they got into by not thinking? … I rage at the imminent loss of my friend. And I think, ‘What would Terry do with this anger?’ Then I pick up my pen, and I start to write.”

Appealing to readers to donate to Alzheimer’s research, Gaiman added on his blog: “Thirty years and a month ago, a beginning author met a young journalist in a Chinese Restaurant, and the two men became friends, and they wrote a book, and they managed to stay friends despite everything. Last night, the author died.

“There was nobody like him. I was fortunate to have written a book with him, when we were younger, which taught me so much.

“I knew his death was coming and it made it no easier.”

Read the entire article here.

Image courtesy of Google Search.

Syndrome X


The quest for immortality, or even great longevity, has probably driven humans since they first became self-aware. Entire cultural movements and industries are founded on the desire to enhance and extend our lives. Genetic research, of course, may eventually unlock some or all of life and death’s mysteries. In the meantime, groups of dedicated scientists continue to search for the foundations of aging with a view to understanding the process and eventually slowing (and perhaps stopping) it. Richard Walker is one of these singularly focused researchers.

From the BBC:

Richard Walker has been trying to conquer ageing since he was a 26-year-old free-loving hippie. It was the 1960s, an era marked by youth: Vietnam War protests, psychedelic drugs, sexual revolutions. The young Walker relished the culture of exultation, of joie de vivre, and yet was also acutely aware of its passing. He was haunted by the knowledge that ageing would eventually steal away his vitality – that with each passing day his body was slightly less robust, slightly more decayed. One evening he went for a drive in his convertible and vowed that by his 40th birthday, he would find a cure for ageing.

Walker became a scientist to understand why he was mortal. “Certainly it wasn’t due to original sin and punishment by God, as I was taught by nuns in catechism,” he says. “No, it was the result of a biological process, and therefore is controlled by a mechanism that we can understand.”

Scientists have published several hundred theories of ageing, and have tied it to a wide variety of biological processes. But no one yet understands how to integrate all of this disparate information.

Walker, now 74, believes that the key to ending ageing may lie in a rare disease that doesn’t even have a real name, “Syndrome X”. He has identified four girls with this condition, marked by what seems to be a permanent state of infancy, a dramatic developmental arrest. He suspects that the disease is caused by a glitch somewhere in the girls’ DNA. His quest for immortality depends on finding it.

It’s the end of another busy week and MaryMargret Williams is shuttling her brood home from school. She drives an enormous SUV, but her six children and their coats and bags and snacks manage to fill every inch. The three big kids are bouncing in the very back. Sophia, 10, with a mouth of new braces, is complaining about a boy-crazy friend. She sits next to Anthony, seven, and Aleena, five, who are glued to something on their mother’s iPhone. The three little kids squirm in three car seats across the middle row. Myah, two, is mining a cherry slushy, and Luke, one, is pawing a bag of fresh crickets bought for the family gecko.

Finally there’s Gabrielle, who’s the smallest child, and the second oldest, at nine years old. She has long, skinny legs and a long, skinny ponytail, both of which spill out over the edges of her car seat. While her siblings giggle and squeal, Gabby’s dusty-blue eyes roll up towards the ceiling. By the calendar, she’s almost an adolescent. But she has the buttery skin, tightly clenched fingers and hazy awareness of a newborn.

Back in 2004, when MaryMargret and her husband, John, went to the hospital to deliver Gabby, they had no idea anything was wrong. They knew from an ultrasound that she would have clubbed feet, but so had their other daughter, Sophia, who was otherwise healthy. And because MaryMargret was a week early, they knew Gabby would be small, but not abnormally so. “So it was such a shock to us when she was born,” MaryMargret says.

Gabby came out purple and limp. Doctors stabilised her in the neonatal intensive care unit and then began a battery of tests. Within days the Williamses knew their new baby had lost the genetic lottery. Her brain’s frontal lobe was smooth, lacking the folds and grooves that allow neurons to pack in tightly. Her optic nerve, which runs between the eyes and the brain, was atrophied, which would probably leave her blind. She had two heart defects. Her tiny fists couldn’t be pried open. She had a cleft palate and an abnormal swallowing reflex, which meant she had to be fed through a tube in her nose. “They started trying to prepare us that she probably wouldn’t come home with us,” John says. Their family priest came by to baptise her.

Day after day, MaryMargret and John shuttled between Gabby in the hospital and 13-month-old Sophia at home. The doctors tested for a few known genetic syndromes, but they all came back negative. Nobody had a clue what was in store for her. Her strong Catholic family put their faith in God. “MaryMargret just kept saying, ‘She’s coming home, she’s coming home’,” recalls her sister, Jennie Hansen. And after 40 days, she did.

Gabby cried a lot, loved to be held, and ate every three hours, just like any other newborn. But of course she wasn’t. Her arms would stiffen and fly up to her ears, in a pose that the family nicknamed her “Harley-Davidson”. At four months old she started having seizures. Most puzzling and problematic, she still wasn’t growing. John and MaryMargret took her to specialist after specialist: a cardiologist, a gastroenterologist, a geneticist, a neurologist, an ophthalmologist and an orthopaedist. “You almost get your hopes up a little – ’This is exciting! We’re going to the gastro doctor, and maybe he’ll have some answers’,” MaryMargret says. But the experts always said the same thing: nothing could be done.

The first few years with Gabby were stressful. When she was one and Sophia two, the Williamses drove from their home in Billings, Montana, to MaryMargret’s brother’s home outside of St Paul, Minnesota. For nearly all of those 850 miles, Gabby cried and screamed. This continued for months until doctors realised she had a run-of-the-mill bladder infection. Around the same period, she acquired a severe respiratory infection that left her struggling to breathe. John and MaryMargret tried to prepare Sophia for the worst, and even planned which readings and songs to use at Gabby’s funeral. But the tiny toddler toughed it out.

While Gabby’s hair and nails grew, her body wasn’t getting bigger. She was developing in subtle ways, but at her own pace. MaryMargret vividly remembers a day at work when she was pushing Gabby’s stroller down a hallway with skylights in the ceiling. She looked down at Gabby and was shocked to see her eyes reacting to the sunlight. “I thought, ‘Well, you’re seeing that light!’” MaryMargret says. Gabby wasn’t blind, after all.

Despite the hardships, the couple decided they wanted more children. In 2007 MaryMargret had Anthony, and the following year she had Aleena. By this time, the Williamses had stopped trudging to specialists, accepting that Gabby was never going to be fixed. “At some point we just decided,” John recalls, “it’s time to make our peace.”

Mortal questions

When Walker began his scientific career, he focused on the female reproductive system as a model of “pure ageing”: a woman’s ovaries, even in the absence of any disease, slowly but inevitably slide into the throes of menopause. His studies investigated how food, light, hormones and brain chemicals influence fertility in rats. But academic science is slow. He hadn’t cured ageing by his 40th birthday, nor by his 50th or 60th. His life’s work was tangential, at best, to answering the question of why we’re mortal, and he wasn’t happy about it. He was running out of time.

So he went back to the drawing board. As he describes in his book, Why We Age, Walker began a series of thought experiments to reflect on what was known and not known about ageing.

Ageing is usually defined as the slow accumulation of damage in our cells, organs and tissues, ultimately causing the physical transformations that we all recognise in elderly people. Jaws shrink and gums recede. Skin slacks. Bones brittle, cartilage thins and joints swell. Arteries stiffen and clog. Hair greys. Vision dims. Memory fades. The notion that ageing is a natural, inevitable part of life is so fixed in our culture that we rarely question it. But biologists have been questioning it for a long time.

It’s a harsh world out there, and even young cells are vulnerable. It’s like buying a new car: the engine runs perfectly but is still at risk of getting smashed on the highway. Our young cells survive only because they have a slew of trusty mechanics on call. Take DNA, which provides the all-important instructions for making proteins. Every time a cell divides, it makes a near-perfect copy of its three-billion-letter code. Copying mistakes happen frequently along the way, but we have specialised repair enzymes to fix them, like an automatic spellcheck. Proteins, too, are ever vulnerable. If it gets too hot, they twist into deviant shapes that keep them from working. But here again, we have a fixer: so-called ‘heat shock proteins’ that rush to the aid of their misfolded brethren. Our bodies are also regularly exposed to environmental poisons, such as the reactive and unstable ‘free radical’ molecules that come from the oxidisation of the air we breathe. Happily, our tissues are stocked with antioxidants and vitamins that neutralise this chemical damage. Time and time again, our cellular mechanics come to the rescue.

Which leads to the biologists’ longstanding conundrum: if our bodies are so well tuned, why, then, does everything eventually go to hell?

One theory is that it all boils down to the pressures of evolution. Humans reproduce early in life, well before ageing rears its ugly head. All of the repair mechanisms that are important in youth – the DNA editors, the heat shock proteins, the antioxidants – help the young survive until reproduction, and are therefore passed down to future generations. But problems that show up after we’re done reproducing cannot be weeded out by evolution. Hence, ageing.

Most scientists say that ageing is not caused by any one culprit but by the breakdown of many systems at once. Our sturdy DNA mechanics become less effective with age, meaning that our genetic code sees a gradual increase in mutations. Telomeres, the sequences of DNA that act as protective caps on the ends of our chromosomes, get shorter every year. Epigenetic messages, which help turn genes on and off, get corrupted with time. Heat shock proteins run down, leading to tangled protein clumps that muck up the smooth workings of a cell. Faced with all of this damage, our cells try to adjust by changing the way they metabolise nutrients and store energy. To ward off cancer, they even know how to shut themselves down. But eventually cells stop dividing and stop communicating with each other, triggering the decline we see from the outside.

Scientists trying to slow the ageing process tend to focus on one of these interconnected pathways at a time. Some researchers have shown, for example, that mice on restricted-calorie diets live longer than normal. Other labs have reported that giving mice rapamycin, a drug that targets an important cell-growth pathway, boosts their lifespan. Still other groups are investigating substances that restore telomeres, DNA repair enzymes and heat shock proteins.

During his thought experiments, Walker wondered whether all of these scientists were fixating on the wrong thing. What if all of these various types of cellular damages were the consequences of ageing, but not the root cause of it? He came up with an alternative theory: that ageing is the unavoidable fallout of our development.

The idea sat on the back burner of Walker’s mind until the evening of 23 October 2005. He was working in his home office when his wife called out to him to join her in the family room. She knew he would want to see what was on TV: an episode of Dateline about a young girl who seemed to be “frozen in time”. Walker watched the show and couldn’t believe what he was seeing. Brooke Greenberg was 12 years old, but just 13 pounds (6kg) and 27 inches (69cm) long. Her doctors had never seen anything like her condition, and suspected the cause was a random genetic mutation. “She literally is the Fountain of Youth,” her father, Howard Greenberg, said.

Walker was immediately intrigued. He had heard of other genetic diseases, such as progeria and Werner syndrome, which cause premature ageing in children and adults respectively. But this girl seemed to be different. She had a genetic disease that stopped her development and with it, Walker suspected, the ageing process. Brooke Greenberg, in other words, could help him test his theory.

Uneven growth

Brooke was born a few weeks premature, with many birth defects. Her paediatrician labelled her with Syndrome X, not knowing what else to call it.

After watching the show, Walker tracked down Howard Greenberg’s address. Two weeks went by before Walker heard back, and after much discussion he was allowed to test Brooke. He was sent Brooke’s medical records as well as blood samples for genetic testing. In 2009, his team published a brief report describing her case.

Walker’s analysis found that Brooke’s organs and tissues were developing at different rates. Her mental age, according to standardised tests, was between one and eight months. Her teeth appeared to be eight years old; her bones, 10 years. She had lost all of her baby fat, and her hair and nails grew normally, but she had not reached puberty. Her telomeres were considerably shorter than those of healthy teenagers, suggesting that her cells were ageing at an accelerated rate.

All of this was evidence of what Walker dubbed “developmental disorganisation”. Brooke’s body seemed to be developing not as a coordinated unit, he wrote, but rather as a collection of individual, out-of-sync parts. “She is not simply ‘frozen in time’,” Walker wrote. “Her development is continuing, albeit in a disorganised fashion.”

The big question remained: why was Brooke developmentally disorganised? It wasn’t nutritional and it wasn’t hormonal. The answer had to be in her genes. Walker suspected that she carried a glitch in a gene (or a set of genes, or some kind of complex genetic programme) that directed healthy development. There must be some mechanism, after all, that allows us to develop from a single cell to a system of trillions of cells. This genetic programme, Walker reasoned, would have two main functions: it would initiate and drive dramatic changes throughout the organism, and it would also coordinate these changes into a cohesive unit.

Ageing, he thought, comes about because this developmental programme, this constant change, never turns off. From birth until puberty, change is crucial: we need it to grow and mature. After we’ve matured, however, our adult bodies don’t need change, but rather maintenance. “If you’ve built the perfect house, you would want to stop adding bricks at a certain point,” Walker says. “When you’ve built a perfect body, you’d want to stop screwing around with it. But that’s not how evolution works.” Because natural selection cannot influence traits that show up after we have passed on our genes, we never evolved a “stop switch” for development, Walker says. So we keep adding bricks to the house. At first this doesn’t cause much damage – a sagging roof here, a broken window there. But eventually the foundation can’t sustain the additions, and the house topples. This, Walker says, is ageing.

Brooke was special because she seemed to have been born with a stop switch. But finding the genetic culprit turned out to be difficult. Walker would need to sequence Brooke’s entire genome, letter by letter.

That never happened. Much to Walker’s chagrin, Howard Greenberg abruptly severed their relationship. The Greenbergs have not publicly explained why they ended their collaboration with Walker, and declined to comment for this article.

Second chance

In August 2009, MaryMargret Williams saw a photo of Brooke on the cover of People magazine, just below the headline “Heartbreaking mystery: The 16-year-old baby”. She thought Brooke sounded a lot like Gabby, so she contacted Walker.

After reviewing Gabby’s details, Walker filled her in on his theory. Testing Gabby’s genes, he said, could help him in his mission to end age-related disease – and maybe even ageing itself.

This didn’t sit well with the Williamses. John, who works for the Montana Department of Corrections, often interacts with people facing the reality of our finite time on Earth. “If you’re spending the rest of your life in prison, you know, it makes you think about the mortality of life,” he says. What’s important is not how long you live, but rather what you do with the life you’re given. MaryMargret feels the same way. For years she has worked in a local dermatology office. She knows all too well the cultural pressures to stay young, and wishes more people would embrace the inevitability of getting older. “You get wrinkles, you get old, that’s part of the process,” she says.

But Walker’s research also had its upside. First and foremost, it could reveal whether the other Williams children were at risk of passing on Gabby’s condition.

For several months, John and MaryMargret hashed out the pros and cons. They were under no illusion that the fruits of Walker’s research would change Gabby’s condition, nor would they want it to. But they did want to know why. “What happened, genetically, to make her who she is?” John says. And more importantly: “Is there a bigger meaning for it?”

John and MaryMargret firmly believe that God gave them Gabby for a reason. Walker’s research offered them a comforting one: to help treat Alzheimer’s and other age-related diseases. “Is there a small piece that Gabby could present to help people solve these awful diseases?” John asks. “Thinking about it, it’s like, no, that’s for other people, that’s not for us.” But then he thinks back to the day Gabby was born. “I was in that delivery room, thinking the same thing – this happens to other people, not us.”

Still not entirely certain, the Williamses went ahead with the research.

Amassing evidence

Walker published his theory in 2011, but he’s only the latest of many researchers to think along the same lines. “Theories relating developmental processes to ageing have been around for a very long time, but have been somewhat under the radar for most researchers,” says Joao Pedro de Magalhaes, a biologist at the University of Liverpool. In 1932, for example, English zoologist George Parker Bidder suggested that mammals have some kind of biological “regulator” that stops growth after the animal reaches a specific size. Ageing, Bidder thought, was the continued action of this regulator after growth was done.

Subsequent studies showed that Bidder wasn’t quite right; there are lots of marine organisms, for example, that never stop growing but age anyway. Still, his fundamental idea of a developmental programme leading to ageing has persisted.

For several years, Stuart Kim’s group at Stanford University has been comparing which genes are expressed in young and old nematode worms. It turns out that some genes involved in ageing also help drive development in youth.

Kim suggested that the root cause of ageing is the “drift”, or mistiming, of developmental pathways during the ageing process, rather than an accumulation of cellular damage.

Other groups have since found similar patterns in mice and primates. One study, for example, found that many genes turned on in the brains of old monkeys and humans are the same as those expressed in young brains, suggesting that ageing and development are controlled by some of the same gene networks.

Perhaps most provocative of all, some studies of worms have shown that shutting down essential development genes in adults significantly prolongs life. “We’ve found quite a lot of genes in which this happened – several dozen,” de Magalhaes says.

Nobody knows whether the same sort of developmental-programme genes exist in people. But say that they do exist. If someone was born with a mutation that completely destroyed this programme, Walker reasoned, that person would undoubtedly die. But if a mutation only partially destroyed it, it might lead to a condition like what he saw in Brooke Greenberg or Gabby Williams. So if Walker could identify the genetic cause of Syndrome X, then he might also have a driver of the ageing process in the rest of us.

And if he found that, then could it lead to treatments that slow – or even end – ageing? “There’s no doubt about it,” he says.

Public stage

After agreeing to participate in Walker’s research, the Williamses, just like the Greenbergs before them, became famous. In January 2011, when Gabby was six, the television channel TLC featured her on a one-hour documentary. The Williams family also appeared on Japanese television and in dozens of newspaper and magazine articles.

Other than becoming a local celebrity, though, Gabby’s everyday life hasn’t changed much since getting involved in Walker’s research. She spends her days surrounded by her large family. She’ll usually lie on the floor, or in one of several cushions designed to keep her spine from twisting into a C shape. She makes noises that would make an outsider worry: grunting, gasping for air, grinding her teeth. Her siblings think nothing of it. They play boisterously in the same room, somehow always careful not to crash into her. Once a week, a teacher comes to the house to work with Gabby. She uses sounds and shapes on an iPad to try to teach cause and effect. When Gabby turned nine, last October, the family made her a birthday cake and had a party, just as they always do. Most of her gifts were blankets, stuffed animals and clothes, just as they are every year. Her aunt Jennie gave her make-up.

Walker teamed up with geneticists at Duke University and screened the genomes of Gabby, John and MaryMargret. This test looked at the exome, the 2% of the genome that codes for proteins. From this comparison, the researchers could tell that Gabby did not inherit any exome mutations from her parents – meaning that it wasn’t likely that her siblings would be able to pass on the condition to their kids. “It was a huge relief – huge,” MaryMargret says.

Still, the exome screening didn’t give any clues as to what was behind Gabby’s disease. Gabby carries several mutations in her exome, but none in a gene that would make sense of her condition. All of us have mutations littering our genomes. So it’s impossible to know, in any single individual, whether a particular mutation is harmful or benign – unless you can compare two people with the same condition.
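
The trio comparison is, at heart, set subtraction: any variant in the child’s exome that appears in neither parent is a candidate de novo mutation, one that arose in the child rather than being inherited. A minimal sketch of that filtering step in Python, with invented variant identifiers standing in for real sequencing calls:

    # Toy illustration of trio-based de novo filtering. Real pipelines work
    # on VCF files with depth and quality filters; these variant IDs are
    # hypothetical placeholders, not data from the study.

    def de_novo_candidates(child, mother, father):
        """Return variants present in the child but in neither parent."""
        return child - (mother | father)

    child_variants = {"chr1:1000:A>G", "chr7:2500:C>T", "chrX:9000:G>A"}
    mother_variants = {"chr1:1000:A>G"}
    father_variants = {"chr7:2500:C>T"}

    print(de_novo_candidates(child_variants, mother_variants, father_variants))
    # -> {'chrX:9000:G>A'}: carried by the child but by neither parent

A variant that survives the subtraction was not passed down by either parent, which is the same logic that reassured the family that Gabby’s siblings were unlikely to carry the condition.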

All girls

Luckily for him, Walker’s continued presence in the media has led him to two other young girls who he believes have the same syndrome. One of them, Mackenzee Wittke, of Alberta, Canada, is now five years old, with long and skinny limbs, just like Gabby. “We have basically been stuck in a time warp,” says her mother, Kim Wittke. The fact that all of these possible Syndrome X cases are girls is intriguing – it could mean that the crucial mutation is on their X chromosome. Or it could just be a coincidence.

Walker is working with a commercial outfit in California to compare all three girls’ entire genome sequences – the exome plus the other 98% of DNA code, which is thought to be responsible for regulating the expression of protein-coding genes.

For his theory, Walker says, “this is do or die – we’re going to do every single bit of DNA in these girls. If we find a mutation that’s common to them all, that would be very exciting.”

But that seems like a very big if.
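
Conceptually, though, the search is an intersection: keep only the variants that all affected genomes share and that unaffected genomes lack. A toy sketch of that set logic in Python (the identifiers are invented, and this is not the actual pipeline Walker’s collaborators use; a real whole-genome analysis involves billions of bases and careful statistics):

    # Toy illustration of a shared-variant search across unrelated cases.
    # Variant IDs are hypothetical; only the set logic is the point.

    from functools import reduce

    cases = [
        {"chr3:501:T>C", "chr9:42:G>A", "chrX:77:C>G"},  # girl 1
        {"chr9:42:G>A", "chrX:77:C>G", "chr12:8:A>T"},   # girl 2
        {"chrX:77:C>G", "chr9:42:G>A", "chr2:15:G>C"},   # girl 3
    ]
    common_in_controls = {"chr9:42:G>A"}  # also widespread in healthy genomes

    shared = reduce(set.intersection, cases) - common_in_controls
    print(shared)  # -> {'chrX:77:C>G'}: a candidate shared by all three

An empty result here is exactly the “very big if” failing; a single surviving candidate on the X chromosome would, by contrast, fit the all-girls pattern noted above.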

Most researchers agree that finding out the genes behind Syndrome X is a worthwhile scientific endeavour, as these genes will no doubt be relevant to our understanding of development. They’re far less convinced, though, that the girls’ condition has anything to do with ageing. “It’s a tenuous interpretation to think that this is going to be relevant to ageing,” says David Gems, a geneticist at University College London. It’s not likely that these girls will even make it to adulthood, he says, let alone old age.

It’s also not at all clear that these girls have the same condition. Even if they do, and even if Walker and his collaborators discover the genetic cause, there would still be a steep hill to climb. The researchers would need to silence the same gene or genes in laboratory mice, which typically have a lifespan of two or three years. “If that animal lives to be 10, then we’ll know we’re on the right track,” Walker says. Then they’d have to find a way to achieve the same genetic silencing in people, whether with a drug or some kind of gene therapy. And then they’d have to begin long and expensive clinical trials to make sure that the treatment was safe and effective. Science is often too slow, and life too fast.

End of life

On 24 October 2013, Brooke passed away. She was 20 years old. MaryMargret heard about it when a friend called after reading it in a magazine. The news hit her hard. “Even though we’ve never met the family, they’ve just been such a part of our world,” she says.

MaryMargret doesn’t see Brooke as a template for Gabby – it’s not as if she now believes that she only has 11 years left with her daughter. But she can empathise with the pain the Greenbergs must be feeling. “It just makes me feel so sad for them, knowing that there’s a lot that goes into a child like that,” she says. “You’re prepared for them to die, but when it finally happens, you can just imagine the hurt.”

Today Gabby is doing well. MaryMargret and John are no longer planning her funeral. Instead, they’re beginning to think about what would happen if Gabby outlives them. (Sophia has offered to take care of her sister.) John turned 50 this year, and MaryMargret will be 41. If there were a pill to end ageing, they say they’d have no interest in it. Quite the contrary: they look forward to getting older, because it means experiencing the new joys, new pains and new ways to grow that come along with that stage of life.

Richard Walker, of course, has a fundamentally different view of growing old. When asked why he’s so tormented by it, he says it stems from childhood, when he watched his grandparents physically and psychologically deteriorate. “There was nothing charming to me about sedentary old people, rocking chairs, hot houses with Victorian trappings,” he says. At his grandparents’ funerals, he couldn’t help but notice that they didn’t look much different in death than they did at the end of life. And that was heartbreaking. “To say I love life is an understatement,” he says. “Life is the most beautiful and magic of all things.”

If his hypothesis is correct – who knows? – it might one day help prevent disease and modestly extend life for millions of people. Walker is all too aware, though, that it would come too late for him. As he writes in his book: “I feel a bit like Moses who, after wandering in the desert for most years of his life, was allowed to gaze upon the Promised Land but not granted entrance into it.”

 Read the entire story here.

Story courtesy of BBC and Mosaic under Creative Commons License.

Image: DNA structure. Courtesy of Wikipedia.

National Extinction Coming Soon

Based on declining fertility rates in some Asian nations, a new study predicts complete national extinctions in the not-too-distant future.

From the Telegraph:

South Koreans will be ‘extinct’ by 2750 if nothing is done to halt the nation’s falling fertility rate, according to a study by The National Assembly Research Service in Seoul.

The fertility rate declined to a new low of 1.19 children per woman in 2013, the study showed, well below the fertility rate required to sustain South Korea’s current population of 50 million people, the Chosun Ilbo reported.

In a simulation, the NARS study suggests that the population will shrink to 40 million in 2056 and 10 million in 2136. The last South Korean, the report indicates, will die in 2750, making it the first national group in the world to become extinct.

The simulation is a worst-case scenario and does not consider possible changes in immigration policy, for example.
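
The arithmetic behind such projections is simple compounding: with a fertility rate of f children per woman and roughly half of each birth cohort female, every generation is about f/2 times the size of the one before. A crude sketch of that worst-case logic in Python, assuming constant fertility, no migration and 30-year generations; it will not reproduce the NARS figures, which come from a full age-structured cohort model:

    # Crude generational projection, NOT the NARS simulation. Assumes the
    # 2013 fertility rate holds forever and ignores migration and the
    # population's age structure.

    FERTILITY = 1.19        # children per woman, South Korea, 2013
    GENERATION_YEARS = 30   # rough span of one generation

    def project(population, year, target_year):
        """Shrink each generation by a factor of roughly FERTILITY / 2."""
        while year < target_year:
            population *= FERTILITY / 2.0  # about half of each cohort are women
            year += GENERATION_YEARS
        return population

    for target in (2056, 2136, 2750):
        print(target, f"{project(50_000_000, 2013, target):,.0f}")

Even this back-of-the-envelope version makes the qualitative point: any rate much below two children per woman shrinks a population geometrically, which is why the report can put a date, however speculative, on the last South Korean.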

The study, carried out at the request of Yang Seung-jo, a member of the opposition New Politics Alliance for Democracy, underlines the challenges facing a number of nations in the Asia-Pacific region.

Japan, Taiwan, Singapore and increasingly China are all experiencing growing financial pressures caused by rising healthcare costs and pension payments for an elderly population.

The problem is particularly serious in South Korea, where more than 38 per cent of the population is predicted to be of retirement age by 2050, according to the National Statistics Office. The equivalent figure in Japan is an estimated 39.6 per cent by 2050.

According to a 2012 study conducted by Tohoku University, Japan will go extinct in about one thousand years, with the last Japanese child born in 3011.

David Coleman, a population expert at Oxford University, has previously warned that South Korea’s fertility rate is so low that it threatens the existence of the nation.

The NARS study suggests that the southern Korean port city of Busan is most at risk, largely because of a sharp decline in the number of young and middle-aged residents, and that the last person will be born in the city in 2413.

Read the entire article here.

Robin Williams You Will Be Missed

Google-search-robin-williams

Mork returned to Ork this weekend; sadly, his creator Robin Williams passed away on August 11, 2014. He was 63. His unique comic genius will be sorely missed.

From NYT:

Some years ago, at a party at the Cannes Film Festival, I was leaning against a rail watching a fireworks display when I heard a familiar voice behind me. Or rather, at least a dozen voices, punctuating the offshore explosions with jokes, non sequiturs and off-the-wall pop-cultural, sexual and political references.

There was no need to turn around: The voices were not talking directly to me and they could not have belonged to anyone other than Robin Williams, who was extemporizing a monologue at least as pyrotechnically amazing as what was unfolding against the Mediterranean sky. I’m unable to recall the details now, but you can probably imagine the rapid-fire succession of accents and pitches — macho basso, squeaky girly, French, Spanish, African-American, human, animal and alien — entangling with curlicues of self-conscious commentary about the sheer ridiculousness of anyone trying to narrate explosions of colored gunpowder in real time.

Part of the shock of his death on Monday came from the fact that he had been on — ubiquitous, self-reinventing, insistently present — for so long. On Twitter, mourners dated themselves with memories of the first time they had noticed him. For some it was the movie “Aladdin.” For others “Dead Poets Society” or “Mrs. Doubtfire.” I go back even further, to the “Mork and Mindy” television show and an album called “Reality — What a Concept” that blew my eighth-grade mind.

Back then, it was clear that Mr. Williams was one of the most explosively, exhaustingly, prodigiously verbal comedians who ever lived. The only thing faster than his mouth was his mind, which was capable of breathtaking leaps of free-associative absurdity. Janet Maslin, reviewing his standup act in 1979, cataloged a tumble of riffs that ranged from an impression of Jacques Cousteau to “an evangelist at the Disco Temple of Comedy,” to Truman Capote Jr. at “the Kindergarten of the Stars” (whatever that was). “He acts out the Reader’s Digest condensed version of ‘Roots,’ ” Ms. Maslin wrote, “which lasts 15 seconds in its entirety. He improvises a Shakespearean-sounding epic about the Three Mile Island nuclear disaster, playing all the parts himself, including Einstein’s ghost.” (That, or something like it, was a role he would reprise more than 20 years later in Steven Spielberg’s “A.I.”)

Read the entire article here.

Image courtesy of Google Search.

Gun Love

Gun Violence in America

The Second Amendment remains ever strong in the U.S. And, of course, so does the number of homicides and child deaths at the hands of guns. Sigh!

From the Guardian:

In February, a nine-year-old Arkansas boy called Hank asked his uncle if he could head off on his own from their remote camp to hunt a rabbit with his .22 calibre rifle. “I said all right,” recalled his uncle Brent later. “It wasn’t a concern. Some people are like, ‘a nine year old shouldn’t be off by himself,’ but he wasn’t an average nine year old.”

Hank was steeped in hunting: when he was two, his father, Brad, would put him in a rucksack on his back when he went turkey hunting. Brad regularly took Hank hunting and said that his son often went off hunting by himself. On this particular day, Hank and his uncle Brent had gone squirrel hunting together as his father was too sick to go.

When Hank didn’t return from hunting the rabbit, his uncle raised the alarm. His mother, Kelli, didn’t learn about his disappearance for seven hours. “They didn’t want to bother me unduly,” she says.

The following morning, though, after police, family and hundreds of locals searched around the camp, Hank’s body was found by a creek with a single bullet wound to the forehead. The cause of death was, according to the police, most likely a hunting accident.

“He slipped and the butt of the gun hit the ground and the gun fired,” says Kelli.

Kelli had recently bought the gun for Hank. “It was the first gun I had purchased for my son, just a youth .22 rifle. I never thought it would be a gun that would take his life.”

Both Kelli and Brad, from whom she is separated, believe that the gun was faulty – it shouldn’t have gone off unless the trigger was pulled, they claim. Since Hank’s death, she’s been posting warnings on her Facebook page about the gun her son used: “I wish someone else had posted warnings about it before what happened,” she says.

Had Kelli not bought the gun and had Brad not trained his son to use it, Hank would have celebrated his 10th birthday on 6 June, which his mother commemorated by posting Hank’s picture on her Facebook page with the message: “Happy Birthday Hank! Mommy loves you!”

Little Hank thus became one in a tally of what the makers of a Channel 4 documentary called Kids and Guns claim to be 3,000 American children who die each year from gun-related accidents. A recent Yale University study found that more than 7,000 US children and adolescents are hospitalised or killed by guns each year and estimates that about 20 children a day are treated in US emergency rooms following incidents involving guns.

Hank’s story is striking, certainly for British readers, for two reasons. One, it dramatises how hunting is for many Americans not the privileged pursuit it is overwhelmingly here, but a traditional family activity as much to do with foraging for food as it is a sport.

Francine Shaw, who directed Kids and Guns, says: “In rural America … people hunt to eat.”

Kelli has a fond memory of her son coming home with what he’d shot. “He’d come in and say: ‘Momma – I’ve got some squirrel to cook.’ And I’d say ‘Gee, thanks.’ That child was happy to bring home meat. He was the happiest child when he came in from shooting.”

But Hank’s story is also striking because it shows how raising kids to hunt and shoot is seen as good parenting, perhaps even as an essential part of bringing up children in America – a society rife with guns and temperamentally incapable of overturning the second amendment that confers the right to bear arms, no matter how many innocent Americans die or get maimed as a result.

“People know I was a good mother and loved him dearly,” says Kelli. “We were both really good parents and no one has said anything hateful to us. The only thing that has been said is in a news report about a nine year old being allowed to hunt alone.”

Does Kelli regret that Hank was allowed to hunt alone at that young age? “Obviously I do, because I’ve lost my son,” she tells me. But she doesn’t blame Brent for letting him go off from camp unsupervised with a gun.

“We’re sure not anti-gun here, but do I wish I could go back in time and not buy that gun? Yes I do. I know you in England don’t have guns. I wish I could go back and have my son back. I would live in England, away from the guns.”

Read the entire article here.

Infographic courtesy of Care2 via visua.ly

Measuring a Life

stephen-sutton

“I don’t see the point in measuring life in time any more… I would rather measure it in terms of what I actually achieve. I’d rather measure it in terms of making a difference, which I think is a much more valid and pragmatic measure.”

These are the inspiring and insightful words of 19-year-old Stephen Sutton, from Birmingham in Britain, spoken about a week before he died from bowel cancer. His upbeat attitude and selflessness during his last days captured the hearts and minds of the nation, and he raised around $5.5 million for cancer charities in the process.

From the Guardian:

Few scenarios can seem as cruel or as bleak as a 19-year-old boy dying of cancer. And yet, in the case of Stephen Sutton, who died peacefully in his sleep in the early hours of Wednesday morning, it became an inspiring, uplifting tale for millions of people.

Sutton was already something of a local hero in Birmingham, where he was being treated, but it was an extraordinary Facebook update in April that catapulted him into the national spotlight.

“It’s a final thumbs up from me,” he wrote, accompanied by a selfie of him lying in a sickbed, covered in drips, smiling cheerfully with his thumbs in the air. “I’ve done well to blag things as well as I have up till now, but unfortunately I think this is just one hurdle too far.”

It was an extraordinary moment: many would have forgiven him being full of rage and misery. And yet here was a simple, understated display of cheerful defiance.

Sutton had originally set a fundraising target of £10,000 for the Teenage Cancer Trust. But the emotional impact of that selfie was so profound that, in a matter of days, more than £3m was donated.

He made a temporary recovery that baffled doctors; he explained that he had “coughed up” a tumour. And so began an extraordinary dialogue with his well-wishers.

To his astonishment, nearly a million people liked his Facebook page and tens of thousands followed him on Twitter. It is fashionable to be downbeat about social media: to dismiss it as being riddled with the banal and the narcissistic, or for stripping human interaction of warmth as conversations shift away from the “real world” to the online sphere.

But it was difficult not to be moved by the online response to Stephen’s story: a national wave of emotion that is not normally forthcoming for those outside the world of celebrity.

His social-media updates were relentlessly upbeat, putting those of us who have tweeted moaning about a cold to shame. “Just another update to let everyone know I am still doing and feeling very well,” he reassured followers less than a week before his death. “My disease is very advanced and will get me eventually, but I will try my damn hardest to be here as long as possible.”

Sutton was diagnosed with bowel cancer in September 2010 when he was 15; tragically, he had been misdiagnosed and treated for constipation months earlier.

But his response was unabashed positivity from the very beginning, even describing his diagnosis as a “good thing” and a “kick up the backside”.

The day he began chemotherapy, he attended a party dressed as a granny – he was so thin and pale, he said, that he was “quite convincing”. He refused to take time off school, where he excelled.

When he was diagnosed as terminally ill two years later, he set up a Facebook page with a bucket list of things he wanted to achieve, including sky-diving, crowd-surfing in a rubber dinghy, and hugging an animal bigger than him (an elephant, it turned out).

But it was his fundraising for cancer research that became his passion, and his efforts will undoubtedly transform the lives of some of the 2,200 teenagers and young adults diagnosed with cancer each year.

The Teenage Cancer Trust on Wednesday said it was humbled and hugely grateful for his efforts, with donations still ticking up and reaching £3.34m by mid-afternoon.

His dream had been to become a doctor. With that ambition taken from him, he sought and found new ways to help people. “Spreading positivity” was another key aim. Four days ago, he organised a National Good Gestures Day, in Birmingham, giving out “free high-fives, hugs, handshakes and fist bumps”.

Indeed, it was not just money for cancer research that Sutton was after. He became an evangelist for a new approach to life.

“I don’t see the point in measuring life in time any more,” he told one crowd. “I would rather measure it in terms of what I actually achieve. I’d rather measure it in terms of making a difference, which I think is a much more valid and pragmatic measure.”

By such a measure, Sutton could scarcely have lived a longer, richer and more fulfilling life.

Read the entire story here.

Image: Stephen Sutton. Courtesy of Google Search.

Second Amendment Redux

Retired U.S. Supreme Court Justice John Paul Stevens argues for a five-word change to the Second Amendment to the U.S. Constitution. His cogent argument is set forth in his essay, excerpted below, from his new book, “Six Amendments: How and Why We Should Change the Constitution.”

Stevens’ newly worded paragraph would read as follows:

A well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms when serving in the Militia shall not be infringed.

Sadly, for those of us who advocate gun control, any such change is highly unlikely during our lifetimes, so the gun lobby’s books will keep swelling by a further 30,000 bodies each year. The five words should have been inserted 200 years ago. It’s far too late now — and school massacres just aren’t enough to shake the sensibilities of most apathetic or paranoid Americans.

From the Washington Post:

Following the massacre of grammar-school children in Newtown, Conn., in December 2012, high-powered weapons have been used to kill innocent victims in more senseless public incidents. Those killings, however, are only a fragment of the total harm caused by the misuse of firearms. Each year, more than 30,000 people die in the United States in firearm-related incidents. Many of those deaths involve handguns.

The adoption of rules that will lessen the number of those incidents should be a matter of primary concern to both federal and state legislators. Legislatures are in a far better position than judges to assess the wisdom of such rules and to evaluate the costs and benefits that rule changes can be expected to produce. It is those legislators, rather than federal judges, who should make the decisions that will determine what kinds of firearms should be available to private citizens, and when and how they may be used. Constitutional provisions that curtail the legislative power to govern in this area unquestionably do more harm than good.

The first 10 amendments to the Constitution placed limits on the powers of the new federal government. Concern that a national standing army might pose a threat to the security of the separate states led to the adoption of the Second Amendment, which provides that “a well regulated Militia, being necessary to the security of a free State, the right of the people to keep and bear Arms, shall not be infringed.”

For more than 200 years following the adoption of that amendment, federal judges uniformly understood that the right protected by that text was limited in two ways: First, it applied only to keeping and bearing arms for military purposes, and second, while it limited the power of the federal government, it did not impose any limit whatsoever on the power of states or local governments to regulate the ownership or use of firearms. Thus, in United States v. Miller, decided in 1939, the court unanimously held that Congress could prohibit the possession of a sawed-off shotgun because that sort of weapon had no reasonable relation to the preservation or efficiency of a “well regulated Militia.”

When I joined the court in 1975, that holding was generally understood as limiting the scope of the Second Amendment to uses of arms that were related to military activities. During the years when Warren Burger was chief justice, from 1969 to 1986, no judge or justice expressed any doubt about the limited coverage of the amendment, and I cannot recall any judge suggesting that the amendment might place any limit on state authority to do anything.

Organizations such as the National Rifle Association disagreed with that position and mounted a vigorous campaign claiming that federal regulation of the use of firearms severely curtailed Americans’ Second Amendment rights. Five years after his retirement, during a 1991 appearance on “The MacNeil/Lehrer NewsHour,” Burger himself remarked that the Second Amendment “has been the subject of one of the greatest pieces of fraud, I repeat the word ‘fraud,’ on the American public by special interest groups that I have ever seen in my lifetime.”

In recent years two profoundly important changes in the law have occurred. In 2008, by a vote of 5 to 4, the Supreme Court decided in District of Columbia v. Heller that the Second Amendment protects a civilian’s right to keep a handgun in his home for purposes of self-defense. And in 2010, by another vote of 5 to 4, the court decided in McDonald v. Chicago that the due process clause of the 14th Amendment limits the power of the city of Chicago to outlaw the possession of handguns by private citizens. I dissented in both of those cases and remain convinced that both decisions misinterpreted the law and were profoundly unwise. Public policies concerning gun control should be decided by the voters’ elected representatives, not by federal judges.

In my dissent in the McDonald case, I pointed out that the court’s decision was unique in the extent to which the court had exacted a heavy toll “in terms of state sovereignty. . . . Even apart from the States’ long history of firearms regulation and its location at the core of their police powers, this is a quintessential area in which federalism ought to be allowed to flourish without this Court’s meddling. Whether or not we can assert a plausible constitutional basis for intervening, there are powerful reasons why we should not do so.”

“Across the Nation, States and localities vary significantly in the patterns and problems of gun violence they face, as well as in the traditions and cultures of lawful gun use. . . . The city of Chicago, for example, faces a pressing challenge in combating criminal street gangs. Most rural areas do not.”

In response to the massacre of grammar-school students at Sandy Hook Elementary School, some legislators have advocated stringent controls on the sale of assault weapons and more complete background checks on purchasers of firearms. It is important to note that nothing in either the Heller or the McDonald opinion poses any obstacle to the adoption of such preventive measures.

First, the court did not overrule Miller. Instead, it “read Miller to say only that the Second Amendment does not protect those weapons not typically possessed by law-abiding citizens for lawful purposes, such as short-barreled shotguns.” On the preceding page of its opinion, the court made it clear that even though machine guns were useful in warfare in 1939, they were not among the types of weapons protected by the Second Amendment because that protected class was limited to weapons in common use for lawful purposes such as self-defense. Even though a sawed-off shotgun or a machine gun might well be kept at home and be useful for self-defense, neither machine guns nor sawed-off shotguns satisfy the “common use” requirement.

Read the entire article here.


The Persistent Self

eterni-screenshot

Many of us strive for persistence beyond the realm of our natural life-spans. Some seek to be remembered through monuments, buildings and other physical objects. Others seek permanence through literary and artistic works. Still others aim for remembrance through less lasting, but noble deeds: social programs, health initiatives, charitable foundations and so on. And yet others wish to be preserved in frozen stasis for later thawing and re-awakening. It is safe to say that many of us would seek to live for ever.

So, it comes as no surprise to see internet startups exploring the market to preserve us or facsimiles of us — digitally — after death. Introducing Eterni.me — your avatar to a virtual eternity.

From Wired (UK):

“We don’t try to replace humans or give false hopes to people grieving.” Romanian design consultant Marius Ursache, cofounder of Eterni.me, needs to clear this up quickly. Because when you’re building a fledgling artificial intelligence company that promises to bring back the dead — or at least, their memories and character, as preserved in their digital footprint — for virtual chats with loved ones, expect a lot of flack.

“It is going to really suck — think Cleverbot with weird out-of-place references to things from that person’s life, masquerading as that person,” wrote one Redditor on the thread “Become Virtually Immortal (In the creepiest way possible)”, which immediately appeared after Eterni.me’s launch was announced last week. Retorts ranged from the bemused — “Now that is some scary f’d up s**t right there. WTF!?” — to the amusing: “Imagine a world where drunk you has to reason with sober AI you before you’re allowed to drunk dial every single person you ever dated or saw naked. So many awkward moments avoided.” But the resounding consensus seems to be that everyone wants to know more.

The site launched with the look of any other Silicon Valley internet startup, but a definitively new take on an old message. While social media companies want you to share and create the story of you while you’re alive, and lifelogging company Memoto promises to capture “meaningful [and shareable] moments”, Eterni.me wants to wrap that all up for those you leave behind into a cohesive AI they can chat with.

Three thousand people registered for the service within the first four days of the site going live, despite there being zero product to make use of (a beta version is slated for 2015). So with a year to ponder your own mortality, why the excitement for a technology that is, at this moment, merely a proof of concept?

“We got very mixed reactions, from ecstatic congratulations to hate mail. And it’s normal — it’s a very polarising topic. But one thing was constant: almost everybody we’ve interacted with truly believes this will be a reality someday. The only question is when it will be a reality and who will make it a reality,” Ursache tells us.

Popular culture and the somewhat innate human need to believe we are impervious have well prepared us for the concept. Ray Kurzweil wants us to upload our brains to computers and develop synthetic neocortexes, and AI has featured prominently on film and TV for decades, including in this month’s Valentine’s Day release of a human-virtual assistant love story. In series two of the British future-focused drama Black Mirror, Hayley Atwell reconnects with her deceased lover using a system comparable to what Eterni.me is trying to achieve — though Ursache calls it a “creepier” version, and tells us “we’re trying to stay away from that idea”, the concept that it’s a way for grieving loved ones to stall moving on.

Sigmund Freud called our relationship with the concept of immortality the “real secret of heroism” — that we carry out heroic feats is only down to a perpetual and inherent belief that our consciousness is permanent. He writes in Reflections on War and Death: “We cannot, indeed, imagine our own death; whenever we try to do so we find that we survive ourselves as spectators. The school of psychoanalysis could thus assert that at bottom no one believes in his own death, which amounts to saying: in the unconscious every one of us is convinced of his immortality… Our unconscious therefore does not believe in its own death; it acts as though it were immortal.”

This is why Eterni.me is not just about loved ones signing up after the event, but individuals signing up to have their own character preserved, under their watchful eye while still alive.

The company’s motto is “it’s like a Skype chat from the past,” but it’s still very much about crafting how the world sees you — or remembers you, in this case — just as you might pause and ponder on hitting Facebook’s post button, wondering till the last if your spaghetti dinner photo/comment really gets the right message across. On its more troubling side, the site plays on the fear that you can no longer control your identity after you’re gone; that you are in fact a mere mortal. “The moments and emotions in our lifetime define how we are seen by our family and friends. All these slowly fade away after we die — until one day… we are all forgotten,” it says in its opening lines — scroll down and it provides the answer to all your problems: “Simply Become Immortal”. Part of the reason we might identify as being immortal — at least unconsciously, as Freud describes it — is because we craft a life we believe will be memorable, or have children we believe our legacy will live on in. Eterni.me’s comment shatters that illusion and could be seen as opportunistic on the founders’ part. The site also goes on to promise a “virtual YOU” that can “offer information and advice to your family and friends after you pass away”, a comfort to anyone worried about leaving behind a spouse or children.

In contrast to this rather dramatic claim, Ursache says: “We’re trying to make it clear that it’s not replacing a person, but trying to preserve as much of the information one generates, and offering asynchronous access to it.”

Read the entire article here.

Image: Eterni.me screenshot. Courtesy of Eterni.

Regrets of the Dying

Bronnie Ware, a palliative care nurse, chronicled her discussions with those close to death in a thoughtful blog called Inspiration and Chai. Her observations are now an even more thoughtful book, The Top Five Regrets of the Dying. The regrets are simple and yet profound; no mention of wanting to “skydive naked” or “appear on a reality TV show”.

From the Guardian:

Bronnie Ware is an Australian nurse who spent several years working in palliative care, caring for patients in the last 12 weeks of their lives. She recorded their dying epiphanies in a blog called Inspiration and Chai, which gathered so much attention that she put her observations into a book called The Top Five Regrets of the Dying.

Ware writes of the phenomenal clarity of vision that people gain at the end of their lives, and how we might learn from their wisdom. “When questioned about any regrets they had or anything they would do differently,” she says, “common themes surfaced again and again.”

Here are the top five regrets of the dying, as witnessed by Ware:

1. I wish I’d had the courage to live a life true to myself, not the life others expected of me.

“This was the most common regret of all. When people realise that their life is almost over and look back clearly on it, it is easy to see how many dreams have gone unfulfilled. Most people had not honoured even a half of their dreams and had to die knowing that it was due to choices they had made, or not made. Health brings a freedom very few realise, until they no longer have it.”

2. I wish I hadn’t worked so hard.

“This came from every male patient that I nursed. They missed their children’s youth and their partner’s companionship. Women also spoke of this regret, but as most were from an older generation, many of the female patients had not been breadwinners. All of the men I nursed deeply regretted spending so much of their lives on the treadmill of a work existence.”

3. I wish I’d had the courage to express my feelings.

“Many people suppressed their feelings in order to keep peace with others. As a result, they settled for a mediocre existence and never became who they were truly capable of becoming. Many developed illnesses relating to the bitterness and resentment they carried as a result.”

4. I wish I had stayed in touch with my friends.

“Often they would not truly realise the full benefits of old friends until their dying weeks and it was not always possible to track them down. Many had become so caught up in their own lives that they had let golden friendships slip by over the years. There were many deep regrets about not giving friendships the time and effort that they deserved. Everyone misses their friends when they are dying.”

5. I wish that I had let myself be happier.

“This is a surprisingly common one. Many did not realise until the end that happiness is a choice. They had stayed stuck in old patterns and habits. The so-called ‘comfort’ of familiarity overflowed into their emotions, as well as their physical lives. Fear of change had them pretending to others, and to their selves, that they were content, when deep within, they longed to laugh properly and have silliness in their life again.”

Read the entire article here.

Seamus Heaney, Come Back

Enough is enough! Our favorite wordsmiths must call a halt right now. First we lost Chris Hitchens, soon followed by Iain Banks. And now, poet extraordinaire, Seamus Heaney.

So, we mourn and celebrate with an excerpt from his 1995 Nobel acceptance speech. You can find more on Heaney’s remarkable life in words, here, at Poetry Foundation.

From the Independent:

When I first encountered the name of the city of Stockholm, I little thought that I would ever visit it, never mind end up being welcomed to it as a guest of the Swedish Academy and the Nobel Foundation.

At the time I am thinking of, such an outcome was not just beyond expectation: it was simply beyond conception. In the nineteen forties, when I was the eldest child of an ever-growing family in rural Co. Derry, we crowded together in the three rooms of a traditional thatched farmstead and lived a kind of den-life which was more or less emotionally and intellectually proofed against the outside world. It was an intimate, physical, creaturely existence in which the night sounds of the horse in the stable beyond one bedroom wall mingled with the sounds of adult conversation from the kitchen beyond the other. We took in everything that was going on, of course – rain in the trees, mice on the ceiling, a steam train rumbling along the railway line one field back from the house – but we took it in as if we were in the doze of hibernation. Ahistorical, pre-sexual, in suspension between the archaic and the modern, we were as susceptible and impressionable as the drinking water that stood in a bucket in our scullery: every time a passing train made the earth shake, the surface of that water used to ripple delicately, concentrically, and in utter silence.

But it was not only the earth that shook for us: the air around and above us was alive and signalling too. When a wind stirred in the beeches, it also stirred an aerial wire attached to the topmost branch of the chestnut tree. Down it swept, in through a hole bored in the corner of the kitchen window, right on into the innards of our wireless set where a little pandemonium of burbles and squeaks would suddenly give way to the voice of a BBC newsreader speaking out of the unexpected like a deus ex machina. And that voice too we could hear in our bedroom, transmitting from beyond and behind the voices of the adults in the kitchen; just as we could often hear, behind and beyond every voice, the frantic, piercing signalling of morse code.

We could pick up the names of neighbours being spoken in the local accents of our parents, and in the resonant English tones of the newsreader the names of bombers and of cities bombed, of war fronts and army divisions, the numbers of planes lost and of prisoners taken, of casualties suffered and advances made; and always, of course, we would pick up too those other, solemn and oddly bracing words, “the enemy” and “the allies”. But even so, none of the news of these world-spasms entered me as terror. If there was something ominous in the newscaster’s tones, there was something torpid about our understanding of what was at stake; and if there was something culpable about such political ignorance in that time and place, there was something positive about the security I inhabited as a result of it.

The wartime, in other words, was pre-reflective time for me. Pre-literate too. Pre-historical in its way. Then as the years went on and my listening became more deliberate, I would climb up on an arm of our big sofa to get my ear closer to the wireless speaker. But it was still not the news that interested me; what I was after was the thrill of story, such as a detective serial about a British special agent called Dick Barton or perhaps a radio adaptation of one of Capt. W.E. Johns’s adventure tales about an RAF flying ace called Biggles. Now that the other children were older and there was so much going on in the kitchen, I had to get close to the actual radio set in order to concentrate my hearing, and in that intent proximity to the dial I grew familiar with the names of foreign stations, with Leipzig and Oslo and Stuttgart and Warsaw and, of course, with Stockholm.

I also got used to hearing short bursts of foreign languages as the dial hand swept round from BBC to Radio Eireann, from the intonations of London to those of Dublin, and even though I did not understand what was being said in those first encounters with the gutturals and sibilants of European speech, I had already begun a journey into the wideness of the world beyond. This in turn became a journey into the wideness of language, a journey where each point of arrival – whether in one’s poetry or one’s life – turned out to be a stepping stone rather than a destination, and it is that journey which has brought me now to this honoured spot. And yet the platform here feels more like a space station than a stepping stone, so that is why, for once in my life, I am permitting myself the luxury of walking on air.

*

I credit poetry for making this space-walk possible. I credit it immediately because of a line I wrote fairly recently instructing myself (and whoever else might be listening) to “walk on air against your better judgement”. But I credit it ultimately because poetry can make an order as true to the impact of external reality and as sensitive to the inner laws of the poet’s being as the ripples that rippled in and rippled out across the water in that scullery bucket fifty years ago. An order where we can at last grow up to that which we stored up as we grew. An order which satisfies all that is appetitive in the intelligence and prehensile in the affections. I credit poetry, in other words, both for being itself and for being a help, for making possible a fluid and restorative relationship between the mind’s centre and its circumference, between the child gazing at the word “Stockholm” on the face of the radio dial and the man facing the faces that he meets in Stockholm at this most privileged moment. I credit it because credit is due to it, in our time and in all time, for its truth to life, in every sense of that phrase.

*

To begin with, I wanted that truth to life to possess a concrete reliability, and rejoiced most when the poem seemed most direct, an upfront representation of the world it stood in for or stood up for or stood its ground against. Even as a schoolboy, I loved John Keats’s ode “To Autumn” for being an ark of the covenant between language and sensation; as an adolescent, I loved Gerard Manley Hopkins for the intensity of his exclamations which were also equations for a rapture and an ache I didn’t fully know I knew until I read him; I loved Robert Frost for his farmer’s accuracy and his wily down-to-earthness; and Chaucer too for much the same reasons. Later on I would find a different kind of accuracy, a moral down-to-earthness to which I responded deeply and always will, in the war poetry of Wilfred Owen, a poetry where a New Testament sensibility suffers and absorbs the shock of the new century’s barbarism. Then later again, in the pure consequence of Elizabeth Bishop’s style, in the sheer obduracy of Robert Lowell’s and in the barefaced confrontation of Patrick Kavanagh’s, I encountered further reasons for believing in poetry’s ability – and responsibility – to say what happens, to “pity the planet,” to be “not concerned with Poetry.”

This temperamental disposition towards an art that was earnest and devoted to things as they are was corroborated by the experience of having been born and brought up in Northern Ireland and of having lived with that place even though I have lived out of it for the past quarter of a century. No place in the world prides itself more on its vigilance and realism, no place considers itself more qualified to censure any flourish of rhetoric or extravagance of aspiration. So, partly as a result of having internalized these attitudes through growing up with them, and partly as a result of growing a skin to protect myself against them, I went for years half-avoiding and half-resisting the opulence and extensiveness of poets as different as Wallace Stevens and Rainer Maria Rilke; crediting insufficiently the crystalline inwardness of Emily Dickinson, all those forked lightnings and fissures of association; and missing the visionary strangeness of Eliot. And these more or less costive attitudes were fortified by a refusal to grant the poet any more license than any other citizen; and they were further induced by having to conduct oneself as a poet in a situation of ongoing political violence and public expectation. A public expectation, it has to be said, not of poetry as such but of political positions variously approvable by mutually disapproving groups.

In such circumstances, the mind still longs to repose in what Samuel Johnson once called with superb confidence “the stability of truth”, even as it recognizes the destabilizing nature of its own operations and enquiries. Without needing to be theoretically instructed, consciousness quickly realizes that it is the site of variously contending discourses. The child in the bedroom, listening simultaneously to the domestic idiom of his Irish home and the official idioms of the British broadcaster while picking up from behind both the signals of some other distress, that child was already being schooled for the complexities of his adult predicament, a future where he would have to adjudicate among promptings variously ethical, aesthetical, moral, political, metrical, sceptical, cultural, topical, typical, post-colonial and, taken all together, simply impossible. So it was that I found myself in the mid-nineteen seventies in another small house, this time in Co. Wicklow south of Dublin, with a young family of my own and a slightly less imposing radio set, listening to the rain in the trees and to the news of bombings closer to home – not only those by the Provisional IRA in Belfast but equally atrocious assaults in Dublin by loyalist paramilitaries from the north. Feeling puny in my predicaments as I read about the tragic logic of Osip Mandelstam’s fate in the 1930s, feeling challenged yet steadfast in my noncombatant status when I heard, for example, that one particularly sweet-natured school friend had been interned without trial because he was suspected of having been involved in a political killing. What I was longing for was not quite stability but an active escape from the quicksand of relativism, a way of crediting poetry without anxiety or apology. In a poem called “Exposure” I wrote then:

If I could come on meteorite!
Instead, I walk through damp leaves,
Husks, the spent flukes of autumn,

Imagining a hero
On some muddy compound,
His gift like a slingstone
Whirled for the desperate.

How did I end up like this?
I often think of my friends’
Beautiful prismatic counselling
And the anvil brains of some who hate me

As I sit weighing and weighing
My responsible tristia.
For what? For the ear? For the people?
For what is said behind-backs?

Rain comes down through the alders,
Its low conducive voices
Mutter about let-downs and erosions
And yet each drop recalls

The diamond absolutes.
I am neither internee nor informer;
An inner émigré, grown long-haired
And thoughtful; a wood-kerne

Escaped from the massacre,
Taking protective colouring
From bole and bark, feeling
Every wind that blows;

Who, blowing up these sparks
For their meagre heat, have missed
The once in a lifetime portent,
The comet’s pulsing rose.
(from North)

Read the entire article here.

Sounds of Extinction

Camera aficionados will find themselves lamenting the demise of the film advance. Now that the world has moved on from film to digital, you will no longer hear that distinctive mechanical ratchet as you wind on the film, hoping the sprocket teeth engage the film's perforations.

Hardcore computer buffs will no doubt miss the beep-beep-hiss sound of the 56K modem — that now seemingly ancient box that once connected us to… well, who knows what it actually connected us to at that speed.

Our favorite arcane sounds, soon to be relegated to the audio graveyard: the telephone handset slam, the click and carriage return of the typewriter, the whir of reel-to-reel tape, the crackle of the diamond stylus as it first drops into the empty lead-in groove of a 33 rpm LP.

More sounds you may (or may not) miss below.

From Wired:

The forward march of technology has a drum beat. These days, it’s custom text-message alerts, or your friend saying “OK, Glass” every five minutes like a tech-drunk parrot. And meanwhile, some of the most beloved sounds are falling out of the marching band.

The boops and beeps of bygone technology can be used to chart its evolution. From the zzzzzzap of the Tesla coil to the tap-tap-tap of Morse code being sent via telegraph, what were once the most important nerd sounds in the world are now just historical signposts. But progress marches forward, and for every irritatingly smug Angry Pigs grunt we have to listen to, we move further away from the sound of the Defender ship exploding.

Let’s celebrate the dying cries of technology’s past. The following sounds are either gone forever, or definitely on their way out. Bow your heads in silence and bid them a fond farewell.

The Telephone Slam

Ending a heated telephone conversation by slamming the receiver down in anger was so incredibly satisfying. There was no better way to punctuate your frustration with the person on the other end of the line. And when that receiver hit the phone, the clack of plastic against plastic was accompanied by a slight ringing of the phone’s internal bell. That’s how you knew you were really pissed — when you slammed the phone so hard, it rang.

There are other sounds we’ll miss from the phone. The busy signal died with the rise of voicemail (although my dad refuses to get voicemail or call waiting, so he’s still OG), and the rapid click-click-click of the dial on a rotary phone is gone. But none of those compare with hanging up the phone with a forceful slam.

Tapping a touchscreen just does not cut it. So the closest thing we have now is throwing the pitifully fragile smartphone against the wall.

The CRT Television

The only TVs left that still use cathode-ray tubes are stashed in the most depressing places — the waiting rooms of hospitals, used car dealerships, and the dusty guest bedroom at your grandparents’ house. But before we all fell prey to the magical resolution of zeros and ones, boxy CRT televisions warmed (literally) the living rooms of every home in America. The sounds they made when you turned them on warmed our hearts, too — the gentle whoosh of the degaussing coil as the set was brought to life with the heavy tug of a pull-switch, or the satisfying mechanical clunk of a power button. As the tube warmed up, you’d see the visuals slowly brighten on the screen, giving you ample time to settle into the couch to enjoy the latest episode of Seinfeld.

Read the entire article here.

Image courtesy of Wired.

Seeking Clues to Suicide

Suicide remains among the most common causes of death in many cultures. The statistics are sobering — in 2012, more U.S. soldiers died by suicide than in combat. Despite advances in the treatment of mental illness, nothing has made a dent in the number of people who take their own lives each year. Psychologist Matthew Nock hopes to change this through some innovative research.

From the New York Times:

For reasons that have eluded people forever, many of us seem bent on our own destruction. Recently more human beings have been dying by suicide annually than by murder and warfare combined. Despite the progress made by science, medicine and mental-health care in the 20th century — the sequencing of our genome, the advent of antidepressants, the reconsidering of asylums and lobotomies — nothing has been able to drive down the suicide rate in the general population. In the United States, it has held relatively steady since 1942. Worldwide, roughly one million people kill themselves every year. Last year, more active-duty U.S. soldiers killed themselves than died in combat; their suicide rate has been rising since 2004. Last month, the Centers for Disease Control and Prevention announced that the suicide rate among middle-aged Americans has climbed nearly 30 percent since 1999. In response to that widely reported increase, Thomas Frieden, the director of the C.D.C., appeared on PBS NewsHour and advised viewers to cultivate a social life, get treatment for mental-health problems, exercise and consume alcohol in moderation. In essence, he was saying, keep out of those demographic groups with high suicide rates, which include people with a mental illness like a mood disorder, social isolates and substance abusers, as well as elderly white males, young American Indians, residents of the Southwest, adults who suffered abuse as children and people who have guns handy.

But most individuals in every one of those groups never have suicidal thoughts — even fewer act on them — and no data exist to explain the difference between those who will and those who won’t. We also have no way of guessing when — in the next hour? in the next decade? — known risk factors might lead to an attempt. Our understanding of how suicidal thinking progresses, or how to spot and halt it, is little better now than it was two and a half centuries ago, when we first began to consider suicide a medical rather than philosophical problem and physicians prescribed, to ward it off, buckets of cold water thrown at the head.

“We’ve never gone out and observed, as an ecologist would or a biologist would go out and observe the thing you’re interested in for hours and hours and hours and then understand its basic properties and then work from that,” Matthew K. Nock, the director of Harvard University’s Laboratory for Clinical and Developmental Research, told me. “We’ve never done it.”

It was a bright December morning, and we were in his office on the 12th floor of the building that houses the school’s psychology department, a white concrete slab jutting above its neighbors like a watchtower. Below, Cambridge looked like a toy city — gabled roofs and steeples, a ribbon of road, windshields winking in the sun. Nock had just held a meeting with four members of his research team — he in his swivel chair, they on his sofa — about several of the studies they were running. His blue eyes matched his diamond-plaid sweater, and he was neatly shorn and upbeat. He seemed more like a youth soccer coach, which he is on Saturday mornings for his son’s first-grade team, than an expert in self-destruction.

At the meeting, I listened to Nock and his researchers discuss a study they were collaborating on with the Army. They were calling soldiers who had recently attempted suicide and asking them to explain what they had done and why. Nock hoped that sifting through the interview transcripts for repeated phrasings or themes might suggest predictive patterns that he could design tests to catch. A clinical psychologist, he had trained each of his researchers how to ask specific questions over the telephone. Adam Jaroszewski, an earnest 29-year-old in tortoiseshell glasses, told me that he had been nervous about calling subjects in the hospital, where they were still recovering, and probing them about why they tried to end their lives: Why that moment? Why that method? Could anything have happened to make them change their minds? Though the soldiers had volunteered to talk, Jaroszewski worried about the inflections of his voice: how could he put them at ease and sound caring and grateful for their participation without ceding his neutral scientific tone? Nock, he said, told him that what helped him find a balance between empathy and objectivity was picturing Columbo, the frumpy, polite, persistently quizzical TV detective played by Peter Falk. “Just try to be really, really curious,” Nock said.

That curiosity has made Nock, 39, one of the most original and influential suicide researchers in the world. In 2011, he received a MacArthur genius award for inventing new ways to investigate the hidden workings of a behavior that seems as impossible to untangle, empirically, as love or dreams.

Trying to study what people are thinking before they try to kill themselves is like trying to examine a shadow with a flashlight: the minute you spotlight it, it disappears. Researchers can’t ethically induce suicidal thinking in the lab and watch it develop. Uniquely human, it can’t be observed in other species. And it is impossible to interview anyone who has died by suicide. To understand it, psychologists have most often employed two frustratingly imprecise methods: they have investigated the lives of people who have killed themselves, and any notes that may have been left behind, looking for clues to what their thinking might have been, or they have asked people who have attempted suicide to describe their thought processes — though their mental states may differ from those of people whose attempts were lethal and their recollections may be incomplete or inaccurate. Such investigative methods can generate useful statistics and hypotheses about how a suicidal impulse might start and how it travels from thought to action, but that’s not the same as objective evidence about how it unfolds in real time.

Read the entire article here.

Image: 2007 suicide statistics for 15-24 year-olds. Courtesy of Crimson White, UA.

Iain (M.) Banks

On June 9, 2013 we lost Iain Banks to cancer. He was a passionate human(ist) and a literary great.

Luckily he left us with a startling collection of resonant and complex works, most notably his series of Culture novels, which imagine a distant future that, should it ever come to pass, will surely count him as a founding member. Mr. Banks, you will be greatly missed.

From the Guardian:

The writer Iain Banks, who has died aged 59, had already prepared his many admirers for his death. On 3 April he announced on his website that he had inoperable gall bladder cancer, giving him, at most, a year to live. The announcement was typically candid and rueful. It was also characteristic in another way: Banks had a large web-attentive readership who liked to follow his latest reflections as well as his writings. Particularly in his later years, he frequently projected his thoughts via the internet. There can have been few novelists of recent years who were more aware of what their readers thought of their books; there is a frequent sense in his novels of an author teasing, testing and replying to a readership with which he was pretty familiar.

His first published novel, The Wasp Factory, appeared in 1984, when he was 30 years old, though it had been rejected by six publishers before being accepted by Macmillan. It was an immediate succès de scandale. The narrator is the 16-year-old Frank Cauldhame, who lives with his taciturn father in an isolated house on the north-east coast of Scotland. Frank lives in a world of private rituals, some of which involve torturing animals, and has committed several murders. The explanation of his isolation and his obsessiveness is shockingly revealed in one of the culminating plot twists for which Banks was to become renowned.

It was followed by Walking on Glass (1985), composed of three separate narratives whose connections are deliberately made obscure until near the end of the novel. One of these seems to be a science fiction narrative and points the way to Banks’s strong interest in this genre. Equally, multiple narration would continue to feature in his work.

The next year’s novel, The Bridge, featured three separate stories told in different styles: one a realist narrative about Alex, a manager in an engineering company, who crashes his car on the Forth road bridge; another the story of John Orr, an amnesiac living on a city-sized version of the bridge; and a third, the first-person narrative of the Barbarian, retelling myths and legends in colloquial Scots. In combining fantasy and allegory with minutely located naturalistic narrative, it was clearly influenced by Alasdair Gray’s Lanark (1981). It remained the author’s own avowed favourite.

His first science fiction novel, Consider Phlebas, was published in 1987, though he had drafted it soon after completing The Wasp Factory. In it he created The Culture, a galaxy-hopping society run by powerful but benevolent machines and possessed of what its inventor called “well-armed liberal niceness”. It would feature in most of his subsequent sci-fi novels. Its enemies are the Idirans, a religious, humanoid race who resent the benign powers of the Culture. In this conflict, good and ill are not simply apportioned. Banks provided a heady mix of, on the one hand, action and intrigue on a cosmic scale (his books were often called “space operas”), and, on the other, ruminations on the clash of ideas and ideologies.

For the rest of his career, literary novels would alternate with works of science fiction, the latter appearing under the name “Iain M Banks” (the “M” standing for Menzies). Banks sometimes spoke of his science fiction books as a writerly vacation from the demands of literary fiction, where he could “pull out the stops”, as he himself put it. The Player of Games (1988) was followed by Use of Weapons (1990). The science fiction employed some of the narrative trickery that characterised his literary fiction: Use of Weapons, for instance, featured two interleaved narratives, one of which moved forward in time and the other backwards. Their connectedness only became clear with a final, somewhat outrageous, twist of the narrative. His many fans came to relish these tricks.

Read the entire article here.

Image: Iain Banks. Courtesy of BBC.

Dead Man Talking

Graham is a man very much alive, but his mind has convinced him that his brain is dead and that he killed it.

From the New Scientist:

Name: Graham
Condition: Cotard’s syndrome

“When I was in hospital I kept on telling them that the tablets weren’t going to do me any good ’cause my brain was dead. I lost my sense of smell and taste. I didn’t need to eat, or speak, or do anything. I ended up spending time in the graveyard because that was the closest I could get to death.”

Nine years ago, Graham woke up and discovered he was dead.

He was in the grip of Cotard’s syndrome. People with this rare condition believe that they, or parts of their body, no longer exist.

For Graham, it was his brain that was dead, and he believed that he had killed it. Suffering from severe depression, he had tried to commit suicide by taking an electrical appliance with him into the bath.

Eight months later, he told his doctor his brain had died or was, at best, missing. “It’s really hard to explain,” he says. “I just felt like my brain didn’t exist any more. I kept on telling the doctors that the tablets weren’t going to do me any good because I didn’t have a brain. I’d fried it in the bath.”

Doctors found trying to rationalise with Graham was impossible. Even as he sat there talking, breathing – living – he could not accept that his brain was alive. “I just got annoyed. I didn’t know how I could speak or do anything with no brain, but as far as I was concerned I hadn’t got one.”

Baffled, they eventually put him in touch with neurologists Adam Zeman at the University of Exeter, UK, and Steven Laureys at the University of Liège in Belgium.

“It’s the first and only time my secretary has said to me: ‘It’s really important for you to come and speak to this patient because he’s telling me he’s dead,'” says Laureys.

Limbo state

“He was a really unusual patient,” says Zeman. Graham’s belief “was a metaphor for how he felt about the world – his experiences no longer moved him. He felt he was in a limbo state caught between life and death”.

No one knows how common Cotard’s syndrome may be. A study published in 1995 of 349 elderly psychiatric patients in Hong Kong found two with symptoms resembling Cotard’s (General Hospital Psychiatry, DOI: 10.1016/0163-8343(94)00066-M). But with successful and quick treatments for mental states such as depression – the condition from which Cotard’s appears to arise most often – readily available, researchers suspect the syndrome is exceptionally rare today. Most academic work on the syndrome is limited to single case studies like Graham’s.

Some people with Cotard’s have reportedly died of starvation, believing they no longer needed to eat. Others have attempted to get rid of their body using acid, which they saw as the only way they could free themselves of being the “walking dead”.

Graham’s brother and carers made sure he ate, and looked after him. But it was a joyless existence. “I didn’t want to face people. There was no point,” he says, “I didn’t feel pleasure in anything. I used to idolise my car, but I didn’t go near it. All the things I was interested in went away.”

Even the cigarettes he used to relish no longer gave him a hit. “I lost my sense of smell and my sense of taste. There was no point in eating because I was dead. It was a waste of time speaking as I never had anything to say. I didn’t even really have any thoughts. Everything was meaningless.”

Low metabolism

A peek inside Graham’s brain provided Zeman and Laureys with some explanation. They used positron emission tomography to monitor metabolism across his brain. It was the first PET scan ever taken of a person with Cotard’s. What they found was shocking: metabolic activity across large areas of the frontal and parietal brain regions was so low that it resembled that of someone in a vegetative state.

Graham says he didn’t really have any thoughts about his future during that time. “I had no other option other than to accept the fact that I had no way to actually die. It was a nightmare.”

Graveyard haunt

This feeling prompted him on occasion to visit the local graveyard. “I just felt I might as well stay there. It was the closest I could get to death. The police would come and get me, though, and take me back home.”

There were some unexplained consequences of the disorder. Graham says he used to have “nice hairy legs”. But after he got Cotard’s, all the hairs fell out. “I looked like a plucked chicken! Saves shaving them I suppose…”

It’s nice to hear him joke. Over time, and with a lot of psychotherapy and drug treatment, Graham has gradually improved and is no longer in the grip of the disorder. He is now able to live independently. “His Cotard’s has ebbed away and his capacity to take pleasure in life has returned,” says Zeman.

“I couldn’t say I’m really back to normal, but I feel a lot better now and go out and do things around the house,” says Graham. “I don’t feel that brain-dead any more. Things just feel a bit bizarre sometimes.” And has the experience changed his feeling about death? “I’m not afraid of death,” he says. “But that’s not to do with what happened – we’re all going to die sometime. I’m just lucky to be alive now.”

Read the entire article here.

Image courtesy of Wikimedia / Public domain.

The Digital Afterlife and i-Death

Leave it to Google to help you auto-euthanize and die digitally. The presence of our online selves after death was of limited concern until recently. However, with the explosion of online media and social networks, our digital tracks remain preserved and scattered across drives and backups in distributed, anonymous data centers. Physical death does not change this.

[A case in point: your friendly editor at theDiagonal was recently asked to befriend a colleague via LinkedIn. All well and good, except that the colleague had passed away two years earlier.]

So, armed with Google’s new Inactive Account Manager, death — at least online — may be just a couple of clicks away. By extension, it would be a small leap indeed to imagine an enterprising company charging an annual fee to maintain a dearly departed member’s digital afterlife ad infinitum.

From the Independent:

The search engine giant Google has announced a new feature designed to allow users to decide what happens to their data after they die.

The feature, which applies to the Google-run email system Gmail as well as Google Plus, YouTube, Picasa and other tools, represents an attempt by the company to be the first to deal with the sensitive issue of data after death.

In a post on the company’s Public Policy Blog Andreas Tuerk, Product Manager, writes: “We hope that this new feature will enable you to plan your digital afterlife – in a way that protects your privacy and security – and make life easier for your loved ones after you’re gone.”

Google says that the new account management tool will allow users to opt to have their data deleted after three, six, nine or 12 months of inactivity. Alternatively users can arrange for certain contacts to be sent data from some or all of their services.

The California-based company did, however, stress that individuals listed to receive data in the event of ‘inactivity’ would be warned by text or email before the information was sent.

Social networking site Facebook already has a function that allows friends and family to “memorialize” an account once its owner has died.
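For the technically curious: Google offers no public API for any of this (the Inactive Account Manager is configured through your account settings page), but the policy the article describes is simple enough to model. The Python sketch below is purely illustrative, and every name in it is invented.

from dataclasses import dataclass, field
from datetime import datetime, timedelta

# Hypothetical sketch only; Google publishes no such API. It models the
# options described above: a 3-, 6-, 9- or 12-month inactivity timeout,
# after which data is either deleted or sent to designated contacts
# (who, per the article, are warned by text or email first).

ALLOWED_TIMEOUTS_MONTHS = (3, 6, 9, 12)

@dataclass
class InactiveAccountPolicy:
    timeout_months: int
    trusted_contacts: list = field(default_factory=list)
    delete_when_triggered: bool = True

    def __post_init__(self):
        if self.timeout_months not in ALLOWED_TIMEOUTS_MONTHS:
            raise ValueError(f"timeout must be one of {ALLOWED_TIMEOUTS_MONTHS}")

    def is_triggered(self, last_activity: datetime, now: datetime) -> bool:
        # Approximate a month as 30 days for this sketch.
        return now - last_activity > timedelta(days=30 * self.timeout_months)

# Usage: share data with one contact after six months of silence.
policy = InactiveAccountPolicy(timeout_months=6,
                               trusted_contacts=["next-of-kin@example.com"])
if policy.is_triggered(datetime(2013, 1, 1), datetime(2013, 9, 1)):
    print("warn the listed contacts, then share or delete per policy")

The point of the sketch is the shape of the thing: a timeout, a contact list and a delete flag. Everything else Google layers on top (the warnings, choosing which services' data to hand over) is plumbing.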

Read the entire article following the jump.