Tag Archives: medical

Wound Man


No, the image is not a still from a forthcoming episode of Law & Order or Criminal Minds. Nor is it a nightmarish Hieronymus Bosch artwork.

Rather, “Wound Man”, as he was known, is a visual table of contents to a medieval manuscript of medical cures, treatments and surgeries. Wound Man first appeared in German surgical texts in the early 15th century. Arranged around each of his various wounds and ailments are references to further details on appropriate treatments. For instance, reference number 38 alongside an arrow penetrating Wound Man’s thigh, “An arrow whose shaft is still in place”, leads to details on how to address the wound — presumably a relatively common occurrence in the Middle Ages.

From Public Domain Review:

Staring impassively out of the page, he bears a multitude of graphic wounds. His skin is covered in bleeding cuts and lesions, stabbed and sliced by knives, spears and swords of varying sizes, many of which remain in the skin, protruding porcupine-like from his body. Another dagger pierces his side, and through his strangely transparent chest we see its tip puncture his heart. His thighs are pierced with arrows, some intact, some snapped down to just their heads or shafts. A club slams into his shoulder, another into the side of his face.

His neck, armpits and groin sport rounded blue buboes, swollen glands suggesting that the figure has contracted plague. His shins and feet are pockmarked with clustered lacerations and thorn scratches, and he is beset by rabid animals. A dog, snake and scorpion bite at his ankles, a bee stings his elbow, and even inside the cavity of his stomach a toad aggravates his innards.

Despite this horrendous cumulative barrage of injuries, however, the Wound Man is very much alive. For the purpose of this image was not to threaten or inspire fear, but to herald potential cures for all of the depicted maladies. He contrarily represented something altogether more hopeful than his battered body: an arresting reminder of the powerful knowledge that could be channelled and dispensed in the practice of late medieval medicine.

The earliest known versions of the Wound Man appeared at the turn of the fifteenth century in books on the surgical craft, particularly works from southern Germany associated with the renowned Würzburg surgeon Ortolf von Baierland (died before 1339). Accompanying a text known as the “Wundarznei” (The Surgery), these first Wound Men effectively functioned as a human table of contents for the cures contained within the relevant treatise. Look closely at the remarkable Wound Man shown above from the Wellcome Library’s MS. 49 – a miscellany including medical material produced in Germany in about 1420 – and you see that the figure is penetrated not only by weapons but also by text.

Read the entire article here.

Image: The Wound Man. Wellcome Library, MS. 49 (CC BY 4.0). Courtesy of the Public Domain Review.

Time For a New Body, Literally


Let me be clear. I’m not referring to a hair transplant, but a head transplant.

A disturbing story has been making the media rounds recently. Dr. Sergio Canavero, of the Turin Advanced Neuromodulation Group in Italy, suggests that the time is right to attempt the transplantation of a human head onto a different body. Canavero believes that advances in surgical techniques and immunotherapy are such that a transplant could be attempted by 2017. Interestingly enough, he has already had several people volunteer for a new body.

Ethics aside, it certainly doesn’t stretch the imagination to believe Hollywood’s elite would clamor for this treatment. Now, I wonder if some people, liking their own body, would want a new head?

From New Scientist:

It’s heady stuff. The world’s first attempt to transplant a human head will be launched this year at a surgical conference in the US. The move is a call to arms to get interested parties together to work towards the surgery.

The idea was first proposed in 2013 by Sergio Canavero of the Turin Advanced Neuromodulation Group in Italy. He wants to use the surgery to extend the lives of people whose muscles and nerves have degenerated or whose organs are riddled with cancer. Now he claims the major hurdles, such as fusing the spinal cord and preventing the body’s immune system from rejecting the head, are surmountable, and the surgery could be ready as early as 2017.

Canavero plans to announce the project at the annual conference of the American Academy of Neurological and Orthopaedic Surgeons (AANOS) in Annapolis, Maryland, in June. Is society ready for such momentous surgery? And does the science even stand up?

The first attempt at a head transplant was carried out on a dog by Soviet surgeon Vladimir Demikhov in 1954. A puppy’s head and forelegs were transplanted onto the back of a larger dog. Demikhov conducted several further attempts but the dogs only survived between two and six days.

The first successful head transplant, in which one head was replaced by another, was carried out in 1970. A team led by Robert White at Case Western Reserve University School of Medicine in Cleveland, Ohio, transplanted the head of one monkey onto the body of another. They didn’t attempt to join the spinal cords, though, so the monkey couldn’t move its body, but it was able to breathe with artificial assistance. The monkey lived for nine days until its immune system rejected the head. Although few head transplants have been carried out since, many of the surgical procedures involved have progressed. “I think we are now at a point when the technical aspects are all feasible,” says Canavero.

This month, he published a summary of the technique he believes will allow doctors to transplant a head onto a new body (Surgical Neurology International, doi.org/2c7). It involves cooling the recipient’s head and the donor body to extend the time their cells can survive without oxygen. The tissue around the neck is dissected and the major blood vessels are linked using tiny tubes, before the spinal cords of each person are cut. Cleanly severing the cords is key, says Canavero.

The recipient’s head is then moved onto the donor body and the two ends of the spinal cord – which resemble two densely packed bundles of spaghetti – are fused together. To achieve this, Canavero intends to flush the area with a chemical called polyethylene glycol, and follow up with several hours of injections of the same stuff. Just like hot water makes dry spaghetti stick together, polyethylene glycol encourages the fat in cell membranes to mesh.

Next, the muscles and blood supply would be sutured and the recipient kept in a coma for three or four weeks to prevent movement. Implanted electrodes would provide regular electrical stimulation to the spinal cord, because research suggests this can strengthen new nerve connections.

When the recipient wakes up, Canavero predicts they would be able to move and feel their face and would speak with the same voice. He says that physiotherapy would enable the person to walk within a year. Several people have already volunteered to get a new body, he says.

The trickiest part will be getting the spinal cords to fuse. Polyethylene glycol has been shown to prompt the growth of spinal cord nerves in animals, and Canavero intends to use brain-dead organ donors to test the technique. However, others are sceptical that this would be enough. “There is no evidence that the connectivity of cord and brain would lead to useful sentient or motor function following head transplantation,” says Richard Borgens, director of the Center for Paralysis Research at Purdue University in West Lafayette, Indiana.

Read the entire article here.

Image: Theatrical poster for the movie The Brain That Wouldn’t Die (1962). Courtesy of Wikipedia.

The Joy of New Technology


We are makers. We humans love to create and invent. Some of our inventions are hideous, laughable or just plain evil — Twinkies, collateralized debt obligations and subprime mortgages, Agent Orange, hair extensions, spray-on tans, cluster bombs, diet water.

However, for every misguided invention there comes something truly great. This time, it's a prosthetic hand that provides a sense of real feeling, courtesy of the makers at the Veterans Affairs Medical Center in Cleveland, Ohio.

From Technology Review:

Igor Spetic’s hand was in a fist when it was severed by a forging hammer three years ago as he made an aluminum jet part at his job. For months afterward, he felt a phantom limb still clenched and throbbing with pain. “Some days it felt just like it did when it got injured,” he recalls.

He soon got a prosthesis. But for amputees like Spetic, these are more tools than limbs. Because the prosthetics can’t convey sensations, people wearing them can’t feel when they have dropped or crushed something.

Now Spetic, 48, is getting some of his sensation back through electrodes that have been wired to residual nerves in his arm. Spetic is one of two people in an early trial that takes him from his home in Madison, Ohio, to the Cleveland Veterans Affairs Medical Center. In a basement lab, his prosthetic hand is rigged with force sensors that are plugged into 20 wires protruding from his upper right arm. These lead to three surgically implanted interfaces, seven millimeters long, with as many as eight electrodes apiece encased in a polymer, that surround three major nerves in Spetic’s forearm.

On a table, a nondescript white box of custom electronics does a crucial job: translating information from the sensors on Spetic’s prosthesis into a series of electrical pulses that the interfaces can translate into sensations. This technology “is 20 years in the making,” says the trial’s leader, Dustin Tyler, a professor of biomedical engineering at Case Western Reserve University and an expert in neural interfaces.

As of February, the implants had been in place and performing well in tests for more than a year and a half. Tyler’s group, drawing on years of neuroscience research on the signaling mechanisms that underlie sensation, has developed a library of patterns of electrical pulses to send to the arm nerves, varied in strength and timing. Spetic says that these different stimulus patterns produce distinct and realistic feelings in 20 spots on his prosthetic hand and fingers. The sensations include pressing on a ball bearing, pressing on the tip of a pen, brushing against a cotton ball, and touching sandpaper, he says. A surprising side effect: on the first day of tests, Spetic says, his phantom fist felt open, and after several months the phantom pain was “95 percent gone.”
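As an aside for the technically minded: the encoding idea is easy to picture in code. Here is a minimal Python sketch, with hypothetical names and values throughout (this is not the Case Western team's actual firmware or published encoding), of how a force reading at one sensor site might be translated into a pulse pattern whose strength and timing vary with touch:

from dataclasses import dataclass

@dataclass
class PulsePattern:
    amplitude_ma: float  # pulse strength in milliamps (assumed units)
    frequency_hz: float  # pulse timing
    duration_ms: float   # how long the pattern is played

# Hypothetical per-site baselines: each of the 20 hand and finger spots
# would get its own calibration so sensations feel distinct.
BASELINES = {"index_tip": (0.5, 50.0), "palm": (0.8, 30.0)}

def encode_touch(site: str, force_n: float) -> PulsePattern:
    """Map one sensor site's force reading to a stimulation pattern."""
    base_amp, base_freq = BASELINES.get(site, (0.6, 40.0))
    # Stronger touch yields stronger, faster pulses: a plausible
    # encoding for illustration, not the published one.
    return PulsePattern(amplitude_ma=base_amp + 0.1 * force_n,
                        frequency_hz=base_freq + 5.0 * force_n,
                        duration_ms=100.0)

print(encode_touch("index_tip", 2.0))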

On this day, Spetic faces a simple challenge: seeing whether he can feel a foam block. He dons a blindfold and noise-canceling headphones (to make sure he’s relying only on his sense of touch), and then a postdoc holds the block inside his wide-open prosthetic hand and taps him on the shoulder. Spetic closes his prosthesis—a task made possible by existing commercial interfaces to residual arm muscles—and reports the moment he touches the block: success.

Read the entire article here.

Image: Prosthetic hand. Courtesy of MIT Technology Review / Veterans Affairs Medical Center.

Doctor Lobotomy


Read the following article once and you could be forgiven for assuming that it’s a fictional screenplay for Hollywood’s next R-rated Halloween flick or perhaps the depraved tale of an associate of Nazi SS officer and physician Josef Mengele.

Read the following article twice and you’ll see that the story of neurologist Dr. Walter Freeman is true: the victims — patients — were military veterans numbering in the thousands, and it took place in the United States following WWII.

This awful story is made all the more incomprehensible by the cadre of assistants, surgeons, psychiatrists, do-gooders and government bureaucrats who actively aided Freeman or did nothing to stop his foolish, amateurish experiments. Unbelievable!

From WSJ:

As World War II raged, two Veterans Administration doctors reported witnessing something extraordinary: An eminent neurologist, Walter J. Freeman, and his partner treating a mentally ill patient by cutting open the skull and slicing through neural fibers in the brain.

It was an operation Dr. Freeman called a lobotomy.

Their report landed on the desk of VA chief Frank Hines on July 26, 1943, in the form of a memo recommending lobotomies for veterans with intractable mental illnesses. The operation “may be done, in suitable cases, under local anesthesia,” the memo said. It “does not demand a high degree of surgical skill.”

The next day Mr. Hines stamped the memo in purple ink: APPROVED.

Over the next dozen or so years, the U.S. government would lobotomize roughly 2,000 American veterans, according to a cache of forgotten VA documents unearthed by The Wall Street Journal, including the memo approved by Mr. Hines. It was a decision made “in accord with our desire to keep abreast of all advances in treatment,” the memo said.

The 1943 decision gave birth to an alliance between the VA and lobotomy’s most dogged salesman, Dr. Freeman, a man famous in his day and notorious in retrospect. His prolific—some critics say reckless—use of brain surgery to treat mental illness places him today among the most controversial figures in American medical history.

At the VA, Dr. Freeman pushed the frontiers of ethically acceptable medicine. He said VA psychiatrists, untrained in surgery, should be allowed to perform lobotomies by hammering ice-pick-like tools through patients’ eye sockets. And he argued that, while their patients’ skulls were open anyway, VA surgeons should be permitted to remove samples of living brain for research purposes.

The documents reveal the degree to which the VA was swayed by his pitch. The Journal this week is reporting the first detailed account of the VA’s psychosurgery program based on records in the National Archives, Dr. Freeman’s own papers at George Washington University, military documents and medical records, as well as interviews with doctors from the era, families of lobotomized vets and one surviving patient, 90-year-old Roman Tritz.

The agency’s use of lobotomy tailed off when the first major antipsychotic drug, Thorazine, came on the market in the mid-1950s, and public opinion of Dr. Freeman and his signature surgery pivoted from admiration to horror.

During and immediately after World War II, lobotomies weren’t greeted with the dismay they prompt today. Still, Dr. Freeman’s views sparked a heated debate inside the agency about the wisdom and ethics of an operation Dr. Freeman himself described as “a surgically induced childhood.”

In 1948, one senior VA psychiatrist wrote a memo mocking Dr. Freeman for using lobotomies to treat “practically everything from delinquency to a pain in the neck.” Other doctors urged more research before forging ahead with such a dramatic medical intervention. A number objected in particular to the Freeman ice-pick technique.

Yet Dr. Freeman’s influence proved decisive. The agency brought Dr. Freeman and his junior partner, neurosurgeon James Watts, aboard as consultants, speakers and inspirations, and its doctors performed lobotomies on veterans at some 50 hospitals from Massachusetts to Oregon.

Born in 1895 to a family of Philadelphia doctors, Yale-educated Dr. Freeman was drawn to psychosurgery by his work in the wards of St. Elizabeth’s Hospital, where Washington’s mentally ill, including World War I veterans, were housed but rarely cured. The treatments of the day—psychotherapy, electroshock, high-pressure water sprays and insulin injections to induce temporary comas—wouldn’t successfully cure serious mental illnesses that resulted from physical defects in the brain, Dr. Freeman believed. His suggestion was to sever faulty neural pathways between the prefrontal area and the rest of the brain, channels believed by lobotomy practitioners to promote excessive emotions.

It was an approach pioneered by Egas Moniz, a Portuguese physician who in 1935 performed the first lobotomy (then called a leucotomy). Fourteen years later, he was rewarded with the Nobel Prize in medicine.

In 1936, Drs. Freeman and Watts performed their first lobotomy, on a 63-year-old woman suffering from depression, anxiety and insomnia. “I knew as soon as I operated on a mental patient and cut into a physically normal brain, I’d be considered radical by some people,” Dr. Watts said in a 1979 interview transcribed in the George Washington University archives.

By his own count, Dr. Freeman would eventually participate in 3,500 lobotomies, some, according to records in the university archives, on children as young as four years old.

“In my father’s hands, the operation worked,” says his son, Walter Freeman III, a retired professor of neurobiology. “This was an explanation for his zeal.”

Drs. Freeman and Watts considered about one-third of their operations successes in which the patient was able to lead a “productive life,” Dr. Freeman’s son says. Another third were able to return home but not support themselves. The final third were “failures,” according to Dr. Watts.

Later in life, Dr. Watts, who died in 1994, offered a blunt assessment of lobotomy’s heyday. “It’s a brain-damaging operation. It changes the personality,” he said in the 1979 interview. “We could predict relief, and we could fairly accurately predict relief of certain symptoms like suicidal ideas, attempts to kill oneself. We could predict there would be relief of anxiety and emotional tension. But we could not nearly as accurately predict what kind of person this was going to be.”

Other possible side-effects included seizures, incontinence, emotional outbursts and, on occasion, death.

Read the entire article here.

Britain’s Genomics NHS

The United Kingdom is charting a visionary strategy that will put its treasured National Health Service (NHS) at the heart of the new revolution in genomics-based medical care.

From Technology Review:

By sequencing the genomes of 100,000 patients and integrating the resulting data into medical care, the U.K. could become the first country to introduce genome sequencing into its mainstream health system. The U.K. government hopes that the investment will improve patient outcomes while also building a genomic medicine industry. But the project will test the practical challenges of integrating and safeguarding genomic data within an expansive health service.

Officials breathed life into the ambitious sequencing project in June when they announced the formation of Genomics England, a company set up to execute the £100 million project. The goal is to “transform how the NHS uses genomic medicine,” says the company’s chief scientist, Mark Caulfield.

Those changes will take many shapes. First, by providing whole-genome sequencing and analysis for National Health Service patients with rare diseases, Genomics England could help families understand the origin of these conditions and help doctors better treat them. Second, the company will sequence the genomes of cancer patients and their tumors, which could help doctors identify the best drugs to treat the disease. Finally, say leaders of the 100,000 genomes project, the efforts could uncover the basis for bacterial and viral resistance to medicines.

“We hope that the legacy at the end of 2017, when we conclude the 100,000 whole-genome sequences, will be a transformed capacity and capability in the NHS to use this data,” says Caulfield.

In the last few years, the cost and time required to sequence DNA have plummeted (see “Bases to Bytes”), making the technology more feasible to use as part of clinical care. Governments around the world are investing in large-scale projects to identify the best way to harness genome technology in a medical setting. For example, the Faroe Islands, a sovereign state within the Kingdom of Denmark, is offering sequencing to all of its citizens to understand the basis of genetic diseases prevalent in the isolated population. The U.S. has funded several large grants to study how to best use medical genomic data, and in 2011 it announced an effort to sequence thousands of veterans’ genomes. In 1999, the Chinese government helped establish the Beijing Genomics Institute, which would later become the world’s most prolific genome institute, providing sequences for projects based in China and abroad (see “Inside China’s Genome Factory”).

But the U.K. project stands out for the large number of genomes planned and the integration of the data into a national health-care system that serves more than 60 million people. The initial program will focus on rare inherited diseases, cancer, and infectious pathogens. Initially, the greatest potential will be in giving families long-sought-after answers as to why a rare disorder afflicts them or their children, and “in 10 or 20 years, there may be treatments sprung from it,” says Caulfield.

In addition to exploring how to best handle and use genomic data, the projects taking place in 2014 will give Genomics England time to explore different sequencing technologies offered by commercial providers. The San Diego-based sequencing company Illumina will provide sequencing at existing facilities in England, but Caulfield emphasizes that the project will want to use the sequencing services of multiple commercial providers. “We are keen to encourage competitiveness in this marketplace as a route to bring down the price for everybody.”

To help control costs for the lofty project, and to foster investment in genomic medicine in the U.K., Genomics England will ask commercial providers to set up sequencing centers in England. “Part of this program is to generate wealth, and that means U.K. jobs,” he says. “We want the sequencing providers to invest in the U.K.” The sequencing centers will be ready by 2015, when the project kicks off in earnest. “Then we will be sequencing 30,000 whole-genome sequences a year,” says Caulfield.
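A quick back-of-the-envelope check, using only the figures quoted above, suggests the stated throughput and timeline roughly hang together:

genomes_target = 100_000  # the "100,000 genomes project"
per_year = 30_000         # "30,000 whole-genome sequences a year" from 2015
print(genomes_target / per_year)  # ~3.3 years at full rate
# Roughly consistent with sequencing centres ready by 2015 and a
# planned 2017 finish, assuming pilot sequencing covers the shortfall.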

Read the entire article here.

Image: Argonne’s Midwest Center for Structural Genomics deposits 1,000th protein structure. Courtesy of Wikipedia.

Chameleon Syringes

How does a design aesthetic save lives? It's simpler than you might think. Take a basic medical syringe, add a twist of color-change technology borrowed from the food-packaging industry, and you get a device that could help prevent many of the 1.3 million deaths caused by unsafe injections each year.

From the Guardian:

You might not want to hear this, but there’s a good reason to be scared of needles: the most deadly clinical procedure in the world is a simple injection.

Every year, 1.3 million deaths are caused by unsafe injections, due to the reuse of syringes. The World Health Organisation (WHO) estimates that up to 40% of the 40bn injections administered annually are delivered with syringes that have been reused without sterilisation, causing over 30% of hepatitis B and C cases and 5% of HIV cases – statistics that have put the problem at number five on the WHO priority list.
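A back-of-the-envelope calculation, using only the WHO figures quoted above, makes the scale concrete:

total_injections = 40e9  # "40bn injections administered annually"
unsafe_share = 0.40      # "up to 40% ... reused without sterilisation"
unsafe = total_injections * unsafe_share
print(f"{unsafe:,.0f} potentially unsafe injections per year")
# up to roughly 16,000,000,000 (16 billion) reused-syringe injections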

It is a call to arms that stirred Dr David Swann, reader in design at the University of Huddersfield, into action, to develop what he describes as a “behaviour-changing syringe” that would warn patients when the needle was unsafe – a design that is now in the running for the Index design awards.

“The difficulty for patients is that it is impossible to determine a visual difference between a used syringe that has been washed and a sterile syringe removed from its packaging,” says Swann. “Instigating a colour change would explicitly expose the risk and could indicate prior use without doubt.”

Keen to keep the price down to ensure accessibility, Swann turned to cheap technologies used in the food industry, using inks that react to carbon dioxide and packaging the syringes in nitrogen-filled packets – just the same as a bag of crisps. Once opened and exposed to the air, the syringe has a 60-second treatment window before turning bright red, while a faceted barrel design means that the piston will break if someone tries to replace it. Remarkably, the ABCs (A Behaviour Changing Syringe) cost only 0.16p more than a typical 2.5p disposable syringe.

Swann is trialling the product in India, as the country is the largest consumer of syringes in the world, accounting for 83% of all injections – over 60% of which are deemed unsafe, and 30% of which transmit a disease in some form, according to the WHO.

“There are landfill scavengers searching piles of waste for syringe devices that are then sold on to medical establishments,” says Swann. “We want to break that cycle.” He estimates that after five years, the ABCs will have prevented 700,000 unsafe injections, saved 6.5 million life years and saved $130m in medical costs in India alone.

Colour-changing technology is increasingly finding medical applications, as designers look to transfer innovations in reactive ink towards potentially lifesaving ends. Husband and wife doctor/designer duo Gautam and Kanupriya Goel have developed a form of packaging for medicine that gradually changes its pattern as the product expires.

Read the entire article here.

Image: Red for danger — ABCs syringe. Courtesy of David Swann / Guardian.

Building a Liver

In yet another breakthrough for medical science, researchers have succeeded in growing rudimentary human livers in the lab.

From the New York Times:

Researchers in Japan have used human stem cells to create tiny human livers like those that arise early in fetal life. When the scientists transplanted the rudimentary livers into mice, the little organs grew, made human liver proteins, and metabolized drugs as human livers do.

They and others caution that these are early days and this is still very much basic research. The liver buds, as they are called, did not turn into complete livers, and the method would have to be scaled up enormously to make enough replacement liver buds to treat a patient. Even then, the investigators say, they expect to replace only 30 percent of a patient’s liver. What they are making is more like a patch than a full liver.

But the promise, in a field that has seen a great deal of dashed hopes, is immense, medical experts said.

“This is a major breakthrough of monumental significance,” said Dr. Hillel Tobias, director of transplantation at the New York University School of Medicine. Dr. Tobias is chairman of the American Liver Foundation’s national medical advisory committee.

“Very impressive,” said Eric Lagasse of the University of Pittsburgh, who studies cell transplantation and liver disease. “It’s novel and very exciting.”

The study was published on Wednesday in the journal Nature.

Although human studies are years away, said Dr. Leonard Zon, director of the stem cell research program at Boston Children’s Hospital, this, to his knowledge, is the first time anyone has used human stem cells, created from human skin cells, to make a functioning solid organ, like a liver, as opposed to bone marrow, a jellylike organ.

Ever since they discovered how to get human stem cells — first from embryos and now, more often, from skin cells — researchers have dreamed of using the cells for replacement tissues and organs. The stem cells can turn into any type of human cell, and so it seemed logical to simply turn them into liver cells, for example, and add them to livers to fill in dead or damaged areas.

But those studies did not succeed. Liver cells did not take up residence in the liver; they did not develop blood supplies or signaling systems. They were not a cure for disease.

Other researchers tried making livers or other organs by growing cells on scaffolds. But that did not work well either. Cells would fall off the scaffolds and die, and the result was never a functioning solid organ.

Researchers have made specialized human cells in petri dishes, but not three-dimensional structures, like a liver.

The investigators, led by Dr. Takanori Takebe of the Yokohama City University Graduate School of Medicine, began with human skin cells, turning them into stem cells. By adding various stimulators and drivers of cell growth, they then turned the stem cells into human liver cells and began trying to make replacement livers.

They say they stumbled upon their solution. When they grew the human liver cells in petri dishes along with blood vessel cells from human umbilical cords and human connective tissue, that mix of cells, to their surprise, spontaneously assembled itself into three-dimensional liver buds, resembling the liver at about five or six weeks of gestation in humans.

Then the researchers transplanted the liver buds into mice, putting them in two places: on the brain and into the abdomen. The brain site allowed them to watch the buds grow. The investigators covered the hole in each animal’s skull with transparent plastic, giving them a direct view of the developing liver buds. The buds grew and developed blood supplies, attaching themselves to the blood vessels of the mice.

The abdominal site allowed them to put more buds in — 12 buds in each of two places in the abdomen, compared with one bud in the brain — which let the investigators ask if the liver buds were functioning like human livers.

They were. They made human liver proteins and also metabolized drugs that human livers — but not mouse livers — metabolize.

The approach makes sense, said Kenneth Zaret, a professor of cellular and developmental biology at the University of Pennsylvania. His research helped establish that blood and connective tissue cells promote dramatic liver growth early in development and help livers establish their own blood supply. On their own, without those other types of cells, liver cells do not develop or form organs.

Read the entire article here.

Image: Diagram of the human liver. Courtesy of Encyclopedia Britannica.

Childhood Injuries on the Rise: Blame Parental Texting

The long-term downward trend in the number of injuries to young children has reversed. Sadly, urgent care and emergency room doctors are now seeing more children aged 0-14 years with unintentional injuries. While the exact causes are yet to be determined, a growing body of anecdotal evidence points to distraction among parents and supervisors — it's the texting, stupid!

The great irony is that should your child suffer an injury while you're using your smartphone, you'll be able to contact the emergency room much more quickly — courtesy of the very same smartphone.

From the Wall Street Journal:

One sunny July afternoon in a San Francisco park, tech recruiter Phil Tirapelle was tapping away on his cellphone while walking with his 18-month-old son. As he was texting his wife, his son wandered off in front of a policeman who was breaking up a domestic dispute.

“I was looking down at my mobile, and the police officer was looking forward,” and his son “almost got trampled over,” he says. “One thing I learned is that multitasking makes you dumber.”

Yet a few minutes after the incident, he still had his phone out. “I’m a hypocrite. I admit it,” he says. “We all are.”

Is high-tech gadgetry diminishing the ability of adults to give proper supervision to very young children? Faced with an unending litany of newly proclaimed threats to their kids, harried parents might well roll their eyes at this suggestion. But many emergency-room doctors are worried: They see the growing use of hand-held electronic devices as a plausible explanation for the surprising reversal of a long slide in injury rates for young children. There have even been a few extreme cases of death and near drowning.

Nonfatal injuries to children under age five rose 12% between 2007 and 2010, after falling for much of the prior decade, according to the most recent data from the Centers for Disease Control and Prevention, based on emergency-room records. The number of Americans 13 and older who own a smartphone such as an iPhone or BlackBerry has grown from almost 9 million in mid-2007, when Apple introduced its device, to 63 million at the end of 2010 and 114 million in July 2012, according to research firm comScore.

Child-safety experts say injury rates had been declining since at least the 1970s, thanks to everything from safer playgrounds to baby gates on staircases to fences around backyard swimming pools. “It was something we were always fairly proud of,” says Dr. Jeffrey Weiss, a pediatrician at Phoenix Children’s Hospital who serves on an American Academy of Pediatrics working group for injury, violence and poison prevention. “The injuries were going down and down and down.” The recent uptick, he says, is “pretty striking.”

Childhood-injury specialists say there appear to be no formal studies or statistics to establish a connection between so-called device distraction and childhood injury. “What you have is an association,” says Dr. Gary Smith, founder and director of the Center for Injury Research and Policy of the Research Institute at Nationwide Children’s Hospital. “Being able to prove causality is the issue…. It certainly is a question that begs to be asked.”

It is well established that using a smartphone while driving or even crossing a street increases the risk of accident. More than a dozen pediatricians, emergency-room physicians, academic researchers and police interviewed by The Wall Street Journal say that a similar factor could be at play in injuries to young children.

“It’s very well understood within the emergency-medicine community that utilizing devices—hand-held devices—while you are assigned to watch your kids—that resulting injuries could very well be because you are utilizing those tools,” says Dr. Wally Ghurabi, medical director of the emergency center at the Santa Monica-UCLA Medical Center and Orthopaedic Hospital.

Adds Dr. Rahul Rastogi, an emergency-room physician at Kaiser Permanente in Oregon: “We think we’re multitasking and not really feeling like we are truly distracted. But in reality we are.”

Read the entire article here.

Image courtesy of Science Daily.

The Nation’s $360 Billion Medical Bill

The United States spends around $2.5 trillion per year on health care. Approximately 14 percent of this is administrative spending. That's $360 billion, yes, billion with a 'b', annually. And, by all accounts, a significant proportion of this huge sum is duplicate, redundant, wasteful and unnecessary spending — that's a lot of paperwork.

From the New York Times:

LAST year I had to have a minor biopsy. Every time I went in for an appointment, I had to fill out a form requiring my name, address, insurance information, emergency contact person, vaccination history, previous surgical history and current medical problems, medications and allergies. I must have done it four times in just three days. Then, after my procedure, I received bills — and, even more annoying, statements of charges that said they weren’t bills — almost daily, from the hospital, the surgeon, the primary care doctor, the insurance company.

Imagine that repeated millions of times daily and you have one of the biggest money wasters in our health care system. Administration accounts for roughly 14 percent of what the United States spends on health care, or about $360 billion per year. About half of all administrative costs — $163 billion in 2009 — are borne by Medicare, Medicaid and insurance companies. The other half pays for the legions employed by doctors and hospitals to fill out billing forms, keep records, apply for credentials and perform the myriad other administrative functions associated with health care.

The range of expert opinions on how much of this could be saved goes as high as $180 billion, or half of current expenditures. But a more conservative and reasonable estimate comes from David Cutler, an economist at Harvard, who calculates that for the whole system — for insurers as well as doctors and hospitals — electronic billing and credentialing could save $32 billion a year. And United Health comes to a similar estimate, with 20 percent of savings going to the government, 50 percent to physicians and hospitals and 30 percent to insurers. For health care cuts to matter, they have to be above 1 percent of total costs, or $26 billion a year, and this conservative estimate certainly meets that threshold.
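The arithmetic here is easy to verify. A small sketch, using only the figures quoted above:

total_spend = 2.5e12  # about $2.5 trillion in annual US health spending
admin = 0.14 * total_spend
print(f"administrative spending: ${admin / 1e9:.0f}B")  # ~$350B ("about $360 billion")

cutler_savings = 32e9  # David Cutler's electronic-billing estimate
shares = {"government": 0.20, "physicians and hospitals": 0.50, "insurers": 0.30}
for who, share in shares.items():
    print(f"{who}: ${cutler_savings * share / 1e9:.1f}B")

threshold = 26e9  # "1 percent of total costs, or $26 billion a year"
print(cutler_savings > threshold)  # True: the saving clears the materiality bar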

How do we get to these savings? First, electronic health records would eliminate the need to fill out the same forms over and over. An electronic credentialing system shared by all hospitals, insurance companies, Medicare, Medicaid, state licensing boards and other government agencies, like the Drug Enforcement Administration, could reduce much of the paperwork doctors are responsible for that patients never see. Requiring all parties to use electronic health records and an online system for physician credentialing would reduce frustration and save billions.

But the real savings is in billing. There are at least six steps in the process: 1) determining a patient’s eligibility for services; 2) obtaining prior authorization for specialist visits, tests and treatments; 3) submitting claims by doctors and hospitals to insurers; 4) verifying whether a claim was received and where in the process it is; 5) adjudicating denials of claims; and 6) receiving payment.
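For the technically inclined, those six steps amount to a simple state machine. A minimal, purely illustrative sketch (not any real clearinghouse's system):

from enum import Enum, auto

class ClaimStep(Enum):
    CHECK_ELIGIBILITY = auto()    # 1) is the patient covered for this service?
    PRIOR_AUTHORIZATION = auto()  # 2) approval for specialist visits and tests
    SUBMIT_CLAIM = auto()         # 3) doctor or hospital files with the insurer
    TRACK_STATUS = auto()         # 4) was the claim received, and where is it?
    ADJUDICATE_DENIAL = auto()    # 5) contest rejected claims
    RECEIVE_PAYMENT = auto()      # 6) payment finally arrives

def process(claim_id: str) -> None:
    # In practice this loop runs once per payer per encounter, which is
    # where the duplicated manual effort (and cost) comes from.
    for step in ClaimStep:
        print(f"claim {claim_id}: {step.name}")

process("demo-001")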

Substantial costs arise from the fact that doctors, hospitals and other care providers must bill multiple insurance companies. Instead of having a unified electronic billing system in which a patient could simply swipe an A.T.M.-like card for automatic verification of eligibility, claims processing and payment, we have a complicated system with lots of expensive manual data entry that produces costly mistakes.

Read the entire article here.

Image: Piles of paperwork. Courtesy of the Guardian.