Category Archives: Idea Soup

How the World May End: Science Versus Brimstone

Every couple of years a (hell)fire and brimstone preacher floats into the national consciousness and makes headlines with certain predictions from the Good Book regarding the imminent destruction of our species and home. Most recently Harold Camping, the radio evangelist, predicted the apocalypse would begin on Saturday, May 21, 2011. His subsequent revision placed the “correct date” at October 21, 2011. Well, we’re still here, so the next apocalyptic date to prepare for, according to watchers of all things Mayan, is December 21, 2012.

So, not to be outdone by prophecy from one particular religion or another, science has come out swinging with its own list of potential end-of-days scenarios. No surprise: many of them may well be of our own making.

[div class=attrib]From the Guardian:[end-div]

Stories of brimstone, fire and gods make good tales and do a decent job of stirring up the requisite fear and jeopardy. But made-up doomsday tales pale into nothing, creatively speaking, when contrasted with what is actually possible. Look through the lens of science and “the end” becomes much more interesting.

Since the beginning of life on Earth, around 3.5 billion years ago, its fragile existence has lived in the shadow of annihilation. On this planet, extinction is the norm – of the 4 billion species thought to have evolved, 99% have become extinct. In particular, five times in the past 500 million years the steady background rate of extinction has shot up for a period of time. Something – no one knows for sure what – turned the Earth into exactly the wrong planet for life at these points, and during each mass extinction more than 75% of the existing species died off in a period of time that was, geologically speaking, a blink of the eye.

One or more of these mass extinctions occurred because of what we could call the big, Hollywood-style, potential doomsday scenarios. If a big enough asteroid hit the Earth, for example, the impact would cause huge earthquakes and tsunamis that could cross the globe. There would be enough dust thrown into the air to block out the sun for several years. As a result, the world’s food resources would be destroyed, leading to famine. It has happened before: the dinosaurs (along with more than half the other species on Earth) were wiped out 65 million years ago by a 10km-wide asteroid that smashed into the area around Mexico.

Other natural disasters include sudden changes in climate or immense volcanic eruptions. All of these could cause global catastrophes that would wipe out large portions of the planet’s life, but, given that we have survived several hundred thousand years of exposure to such risks, it is unlikely that a natural disaster of this kind will cause catastrophe in the next few centuries.

In addition, cosmic threats to our existence have always been with us, even though it has taken us some time to notice: the collision of our galaxy, the Milky Way, with our nearest neighbour, Andromeda, for example, or the arrival of a black hole. Common to all of these threats is that there is very little we can do about them even when we know the danger exists, except to try to work out how to survive the aftermath.

But in reality, the most serious risks for humans might come from our own activities. Our species has the unique ability in the history of life on Earth to be the first capable of remaking our world. But we can also destroy it.

All too real are the human-caused threats born of climate change, excess pollution, depletion of natural resources and the madness of nuclear weapons. We tinker with our genes and atoms at our own peril. Nanotechnology, synthetic biology and genetic modification offer much potential in giving us better food to eat, safer drugs and a cleaner world, but they could also go wrong if misapplied or if we charge on without due care.

Some strange ways to go, and their corresponding danger signs, are listed below:

DEATH BY EUPHORIA

Many of us use drugs such as caffeine or nicotine every day. Our increased understanding of physiology brings new drugs that can lift mood, improve alertness or keep you awake for days. How long before we use so many drugs we are no longer in control? Perhaps the end of society will not come with a bang, but fade away in a haze.

Danger sign: Drugs would get too cheap to meter, but you might be too doped up to notice.

VACUUM DECAY

If the Earth exists in a region of space known as a false vacuum, it could collapse into a lower-energy state at any point. This collapse would grow at the speed of light and our atoms would not hold together in the ensuing wave of intense energy – everything would be torn apart.

Danger sign: There would be no signs. It could happen halfway through this…

STRANGELETS

Quantum mechanics contains lots of frightening possibilities. Among them is a particle called a strangelet that can transform any other particle into a copy of itself. In just a few hours, a small chunk of these could turn a planet into a featureless mass of strangelets. Everything that planet was would be no more.

Danger sign: Everything around you starts cooking, releasing heat.

END OF TIME

What if time itself somehow came to a finish because of the laws of physics? In 2007, Spanish scientists proposed an alternative explanation for the mysterious dark energy that accounts for 75% of the mass of the universe and acts as a sort of anti-gravity, pushing galaxies apart. They proposed that the effects we observe are due to time slowing down as it leaked away from our universe.

Danger sign: It could be happening right now. We would never know.

MEGA TSUNAMI

Geologists worry that a future volcanic eruption at La Palma in the Canary Islands might dislodge a chunk of rock twice the volume of the Isle of Man into the Atlantic Ocean, triggering waves a kilometre high that would move at the speed of a jumbo jet with catastrophic effects for the shores of the US, Europe, South America and Africa.

Danger sign: Half the world’s major cities are under water. All at once.

GEOMAGNETIC REVERSAL

The Earth’s magnetic field provides a shield against harmful radiation from our sun that could rip through DNA and overload the world’s electrical systems. Every so often, Earth’s north and south poles switch positions and, during the transition, the magnetic field will weaken or disappear for many years. The last known transition happened almost 780,000 years ago and it is likely to happen again.

Danger sign: Electronics stop working.

GAMMA RAYS FROM SPACE

When a supermassive star is in its dying moments, it shoots out two beams of high-energy gamma rays into space. If these were to hit Earth, the immense energy would tear apart the atmosphere’s air molecules and disintegrate the protective ozone layer.

Danger sign: The sky turns brown and all life on the surface slowly dies.

RUNAWAY BLACK HOLE

Black holes are the most powerful gravitational objects in the universe, capable of tearing Earth into its constituent atoms. Even within a billion miles, a black hole could knock Earth out of the solar system, leaving our planet wandering through deep space without a source of energy.

Danger sign: Increased asteroid activity; the seasons get really extreme.

INVASIVE SPECIES

Invasive species are plants, animals or microbes that turn up in an ecosystem that has no protection against them. The invader’s population surges and the ecosystem quickly destabilises towards collapse. Invasive species are already an expensive global problem: they disrupt local ecosystems, transfer viruses, poison soils and damage agriculture.

Danger sign: Your local species disappear.

TRANSHUMANISM

What if biological and technological enhancements took humans to a level where they radically surpassed anything we know today? “Posthumans” might consist of artificial intelligences based on the thoughts and memories of ancient humans, who uploaded themselves into a computer and exist only as digital information on superfast computer networks. Their physical bodies might be gone but they could access and store endless information and share their thoughts and feelings immediately and unambiguously with other digital humans.

Danger sign: You are outcompeted, mentally and physically, by a cyborg.

[div class=attrib]Read more of this article here.[end-div]

[div class=attrib]End is Nigh Sign. Courtesy of frontporchrepublic.com.[end-div]

What Exactly is a Person?

The recent “personhood” amendment on the ballot in Mississippi has caused many to scratch their heads and ponder the meaning of “person”. Philosophers through the ages have tackled this thorny question with detailed treatises and little consensus.

Boethius suggested that a person is “the individual substance of a rational nature.” Descartes described a person as an agent, human or otherwise, possessing consciousness, and capable of creating and acting on a plan. John Locke extended this definition to include reason and reflection. Kant looked at a person as a being having a conceptualizing mind capable of purposeful thought. Charles Taylor takes this naturalistic view further, defining a person as an agent driven by matters of significance. Harry Frankfurt characterized a person as an entity with free will driven by a hierarchy of desires. Still others provide their own definitions. Peter Singer offers self-awareness as a distinguishing trait; Thomas White suggests that a person has the following elements: is alive, is aware, feels sensations, has emotions, has a sense of self, controls its own behaviour, recognises other persons, and has various cognitive abilities.

Despite the variation in positions, all would seem to agree that a fertilized egg is certainly not a person.

    [div class=attrib]A thoughtful take over at 13.7 Cosmos and Culture blog:[end-div]

    According to Catholic doctrine, the Father, the Son and Holy Spirit are three distinct persons even though they are one essence. Only one of those persons — Jesus Christ — is also a human being whose life had a beginning and an end.

    I am not an expert in Trinitarian theology. But I mention it here because, great mysteries aside, this Catholic doctrine uses the notion of person in what, from our point of view today, is the standard way.

    John Locke called person a forensic concept. What he had in mind is that a person is one to whom credit and blame may be attached, one who is deemed responsible. The concept of a person is the concept of an agent.

    Crucially, Locke argued, persons are not the same as human beings. Dr. Jekyll and Mr. Hyde may be one and the same human being, that is, one and the same continuously existing organic life; they share a birth event; but they are two distinct persons. And this is why we don’t blame the one for the other’s crimes. Multiple personality disorder might be a real world example of this.

    I don’t know whether Locke believed that two distinct persons could actually inhabit the same living human body, but he certainly thought there was nothing contradictory in the possibility. Nor did he think there was anything incoherent in the thought that one person could find existence in multiple distinct animal lives, even if, as a matter of fact, this may not be possible. If you believe in reincarnation, then you think this is a genuine possibility. For Locke, this was no more incoherent than the idea of two actors playing the same role in a play.

    Indeed, the word “person” derives from a Latin (and originally a Greek) word meaning “character in a drama” or “mask” (because actors wore masks). This usage survives today in the phrase “dramatis personae.” To be a person, from this standpoint, is to play a role. The person is the role played, however, not the player.

    From this standpoint, the idea of a non-human, non-living person certainly makes sense, even if we find it disturbing. Corporations are persons under current law, and this makes sense. They are actors, after all, and we credit and blame them for the things they do. They play an important role in our society.

    [div class=attrib]Read the whole article here.[end-div]

    [div class=attrib]Image: Abstract painting of a person, titled WI (In Memoriam), by Paul Klee (1879–1940). Courtesy of Wikipedia.[end-div]

    Supercommittee and Innovation: Oxymoron Du Jour

    Today is deadline day for the U.S. Congressional Select Committee on Deficit Reduction to deliver. Perhaps a little ironically, the committee was commonly mistitled the “Super Committee”. Interestingly, pundits and public alike do not expect the committee to deliver any significant, long-term solution to the United States’ fiscal problems. In fact, many do not believe the committee will deliver anything at all beyond reinforcement of right- and left-leaning ideologies, political posturing, pandering to special interests of all colors and, of course, recriminations and spin.

    Could the Founders have had such dysfunction in mind when they designed the branches of government, with their many checks and balances to guard against excess and tyranny? Probably not. So perhaps it’s finally time for the United States Congress to gulp a large dose of corporate-style innovation.

    [div class=attrib]From the Washington Post:[end-div]

    … Fiscal catastrophe has been around the corner, on and off, for 15 years. In that period, Dole and President Bill Clinton, a Democrat, came together to produce a record-breaking $230 billion surplus. That was later depleted by actions undertaken by both sides, bringing us to the tense situation we have today.

    What does this have to do with innovation?

    As the profession of innovation management matures, we are learning a few key things, including that constraints can be a good thing — and the “supercommittee” clock is a big constraint. Given this, what is the best strategy when you need to innovate in a hurry?

    When innovating under the gun, the first thing you must do is assemble a small, diverse team to own and attack the challenge. The “supercommittee” team is handicapped from the start, since it is neither small (think 4-5 people) nor diverse (neither in age nor expertise). Second, successful innovators envision what success looks like and pursue it single-mindedly – failure is not an option.

    Innovators also divide big challenges into smaller challenges that a small team can feel passionate about and assault on an even shorter timeline than the overall challenge. This requires that you put as much (or more) effort into determining the questions that form the challenges as you do into trying to solve them. Innovators ask big questions that challenge the status quo, such as “How could we generate revenue without taxes?” or “What spending could we avoid and how?” or “How would my son or my grandmother approach this?”

    To solve the challenges, successful innovators recruit people not only with expertise most relevant to the challenge, but also people with expertise in distant specialties, which, in innovation, is often where the best solutions come from.

    But probably most importantly, all nine innovation roles — the revolutionary, the conscript, the connector, the artist, customer champion, troubleshooter, judge, magic maker and evangelist — must be filled for an innovation effort to be successful.

    [div class=attrib]Read the entire article here.[end-div]

    What of the Millennials?

    The hippies of the sixties wanted love; the beatniks sought transcendence. Then came the punks, who were all about rage. The slackers and generation X stood for apathy and worry. And now, coming of age, we have generation Y, also known as the “millennials”, whose birthdays fall roughly between 1982 and 2000.

    A fascinating article by William Deresiewicz, excerpted below, posits the millennials as a “post-emotional” generation. Interestingly, while this generation seems to be fragmented, its members are much more focused on their own “brand identity” than previous generations.

    [div class=attrib]From the New York Times:[end-div]

    EVER since I moved three years ago to Portland, Ore., that hotbed of all things hipster, I’ve been trying to get a handle on today’s youth culture. The style is easy enough to describe — the skinny pants, the retro hats, the wall-to-wall tattoos. But style is superficial. The question is, what’s underneath? What idea of life? What stance with respect to the world?

    So what’s the affect of today’s youth culture? Not just the hipsters, but the Millennial Generation as a whole, people born between the late ’70s and the mid-’90s, more or less — of whom the hipsters are a lot more representative than most of them care to admit. The thing that strikes me most about them is how nice they are: polite, pleasant, moderate, earnest, friendly. Rock ’n’ rollers once were snarling rebels or chest-beating egomaniacs. Now the presentation is low-key, self-deprecating, post-ironic, eco-friendly. When Vampire Weekend appeared on “The Colbert Report” last year to plug their album “Contra,” the host asked them, in view of the title, what they were against. “Closed-mindedness,” they said.

    According to one of my students at Yale, where I taught English in the last decade, a colleague of mine would tell his students that they belonged to a “post-emotional” generation. No anger, no edge, no ego.

    What is this about? A rejection of culture-war strife? A principled desire to live more lightly on the planet? A matter of how they were raised — everybody’s special and everybody’s point of view is valid and everybody’s feelings should be taken care of?

    Perhaps a bit of each, but mainly, I think, something else. The millennial affect is the affect of the salesman. Consider the other side of the equation, the Millennials’ characteristic social form. Here’s what I see around me, in the city and the culture: food carts, 20-somethings selling wallets made from recycled plastic bags, boutique pickle companies, techie start-ups, Kickstarter, urban-farming supply stores and bottled water that wants to save the planet.

    Today’s ideal social form is not the commune or the movement or even the individual creator as such; it’s the small business. Every artistic or moral aspiration — music, food, good works, what have you — is expressed in those terms.

    Call it Generation Sell.

    Bands are still bands, but now they’re little businesses, as well: self-produced, self-published, self-managed. When I hear from young people who want to get off the careerist treadmill and do something meaningful, they talk, most often, about opening a restaurant. Nonprofits are still hip, but students don’t dream about joining one, they dream about starting one. In any case, what’s really hip is social entrepreneurship — companies that try to make money responsibly, then give it all away.

    [div class=attrib]Read the entire article here.[end-div]

    [div class=attrib]Image: Millennial Momentum, Authors: Morley Winograd and Michael D. Hais, Rutgers University Press.[end-div]

    The Nation’s $360 Billion Medical Bill

    The United States spends around $2.5 trillion per year on health care. Approximately 14 percent of this is administrative spending. That’s $360 billion, yes, billion with a ‘b’, annually. And, by all accounts, a significant proportion of this huge sum is duplicate, redundant, wasteful and unnecessary spending — that’s a lot of paperwork.

    [div class=attrib]From the New York Times:[end-div]

    LAST year I had to have a minor biopsy. Every time I went in for an appointment, I had to fill out a form requiring my name, address, insurance information, emergency contact person, vaccination history, previous surgical history and current medical problems, medications and allergies. I must have done it four times in just three days. Then, after my procedure, I received bills — and, even more annoying, statements of charges that said they weren’t bills — almost daily, from the hospital, the surgeon, the primary care doctor, the insurance company.

    Imagine that repeated millions of times daily and you have one of the biggest money wasters in our health care system. Administration accounts for roughly 14 percent of what the United States spends on health care, or about $360 billion per year. About half of all administrative costs — $163 billion in 2009 — are borne by Medicare, Medicaid and insurance companies. The other half pays for the legions employed by doctors and hospitals to fill out billing forms, keep records, apply for credentials and perform the myriad other administrative functions associated with health care.

    The range of expert opinions on how much of this could be saved goes as high as $180 billion, or half of current expenditures. But a more conservative and reasonable estimate comes from David Cutler, an economist at Harvard, who calculates that for the whole system — for insurers as well as doctors and hospitals — electronic billing and credentialing could save $32 billion a year. And United Health comes to a similar estimate, with 20 percent of savings going to the government, 50 percent to physicians and hospitals and 30 percent to insurers. For health care cuts to matter, they have to be above 1 percent of total costs, or $26 billion a year, and this conservative estimate certainly meets that threshold.
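
    For readers who want to check the arithmetic, here is a minimal back-of-the-envelope sketch in Python. All the inputs are the figures quoted above; the small mismatches (about $350 billion versus the quoted $360 billion, and $25 billion versus $26 billion) come down to rounding in the source.

    # Rough check of the administrative-cost figures quoted above.
    total_spend = 2.5e12   # annual U.S. health care spending (~$2.5 trillion)
    admin_share = 0.14     # administrative share of that spending

    admin_cost = total_spend * admin_share
    print(f"Administrative spending: ${admin_cost / 1e9:.0f}B per year")  # ~$350B (quoted as ~$360B)

    # David Cutler's estimate of annual savings from electronic billing and
    # credentialing, and the United Health split of who would pocket them.
    cutler_savings = 32e9
    split = {"government": 0.20, "physicians and hospitals": 0.50, "insurers": 0.30}
    for party, share in split.items():
        print(f"  {party}: ${cutler_savings * share / 1e9:.1f}B")

    # The article's materiality threshold: cuts matter above 1% of total costs.
    threshold = 0.01 * total_spend
    print(f"1% threshold: ${threshold / 1e9:.0f}B per year")  # ~$25B (quoted as $26B)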

    How do we get to these savings? First, electronic health records would eliminate the need to fill out the same forms over and over. An electronic credentialing system shared by all hospitals, insurance companies, Medicare, Medicaid, state licensing boards and other government agencies, like the Drug Enforcement Administration, could reduce much of the paperwork doctors are responsible for that patients never see. Requiring all parties to use electronic health records and an online system for physician credentialing would reduce frustration and save billions.

    But the real savings is in billing. There are at least six steps in the process: 1) determining a patient’s eligibility for services; 2) obtaining prior authorization for specialist visits, tests and treatments; 3) submitting claims by doctors and hospitals to insurers; 4) verifying whether a claim was received and where in the process it is; 5) adjudicating denials of claims; and 6) receiving payment.

    Substantial costs arise from the fact that doctors, hospitals and other care providers must bill multiple insurance companies. Instead of having a unified electronic billing system in which a patient could simply swipe an A.T.M.-like card for automatic verification of eligibility, claims processing and payment, we have a complicated system with lots of expensive manual data entry that produces costly mistakes.

    [div class=attrib]Read more of this article here.[end-div]

    [div class=attrib]Image: Piles of paperwork. Courtesy of the Guardian.[end-div]

    Definition of Technocrat

    The unfolding financial crises and political upheavals in Europe have claimed several casualties, notably the leaders and their governments in Greece and Italy. Both have been replaced by so-called “technocrats”. So, what exactly is a technocrat, and why call on one? Slate explains.

    [div class=attrib]From Slate:[end-div]

    Lucas Papademos was sworn in as the new prime minister of Greece Friday morning. In Italy, it’s expected that Silvio Berlusconi will be replaced by former EU commissioner Mario Monti. Both men have been described as “technocrats” in major newspapers. What, exactly, is a technocrat?

    An expert, not a politician. Technocrats make decisions based on specialized information rather than public opinion. For this reason, they are sometimes called upon when there’s no popular or easy solution to a problem (like, for example, the European debt crisis). The word technocrat derives from the Greek tekhne, meaning skill or craft, and an expert in a field like economics can be as much a technocrat as one in a field more commonly thought to be technological (like robotics). Both Papademos and Monti hold advanced degrees in economics, and have each held appointments at government institutions.

    The word technocrat can also refer to an advocate of a form of government in which experts preside. The notion of a technocracy remains mostly hypothetical, though some nations have been considered as such in the sense of being governed primarily by technical experts. Historian Walter A. McDougall argued that the Soviet Union was the world’s first technocracy, and indeed its Politburo included an unusually high proportion of engineers. Other nations, including Italy and Greece, have undergone some short periods under technocratic regimes. Carlo Azeglio Ciampi, formerly an economist and central banker, served as prime minister of Italy from 1993 to 1994. Economist and former Bank of Greece director Xenophon Zolotas served as Prime Minister of Greece from 1989 to 1990.

    In the United States, technocracy was most popular in the early years of the Great Depression. Inspired in part by the ideas of economist Thorstein Veblen, the movement was led by engineer Howard Scott, who proposed radical utopian ideas and solutions to the economic disaster in scientific language. His movement, founded in 1932, drew national interest—the New York Times was the first major news organization to report the phenomenon, and Liberty Digest declared, “Technocracy is all the rage. All over the country it is being talked about, explained, wondered at, praised, damned. It is found about as easy to explain … as the Einstein theory of relativity.” A year later, it had mostly flamed out. No popular Technocratic party exists in the United States today, but Scott’s organization, called Technocracy Incorporated, persists in drastically reduced form.

    [div class=attrib]Read the entire article here.[end-div]

    [div class=attrib]Image: Mario Monti. Courtesy of Daily Telegraph.[end-div]

    Offshoring and Outsourcing of Innovation

    A fascinating article over at the Wall Street Journal contemplates the demise of innovation in the United States. It’s no surprise where it’s heading — China.

    [div class=attrib]From the Wall Street Journal:[end-div]

    At a recent business dinner, the conversation about intellectual-property theft in China was just getting juicy when an executive with a big U.S. tech company leaned forward and said confidently: “This isn’t such a problem for us because we plan on innovating new products faster than the Chinese can steal the old ones.”

    That’s a solution you often hear from U.S. companies: The U.S. will beat the Chinese at what the U.S. does best—innovation—because China’s bureaucratic, state-managed capitalism can’t master it.

    The problem is, history isn’t on the side of that argument, says Niall Ferguson, an economic historian whose new book, “Civilization: The West and the Rest,” was published this week. Mr. Ferguson, who teaches at Harvard Business School, says China and the rest of Asia have assimilated much of what made the West successful and are now often doing it better.

    “I’ve stopped believing that there’s some kind of cultural defect that makes the Chinese incapable of innovating,” he says. “They’re going to have the raw material of better educated kids that ultimately drives innovation.”

    Andrew Liveris, the chief executive of Dow Chemical, has pounded this drum for years, describing what he sees as a drift in engineering and manufacturing acumen from the West to Asia. “Innovation has followed manufacturing to China,” he told a group at the Wharton Business School recently.

    “Over time, when companies decide where to build R&D facilities, it will make more and more sense to do things like product support, upgrades and next-generation design in the same place where the product is made,” he said. “That is one reason why Dow has 500 Chinese scientists working in China, earning incredibly good money, and who are already generating more patents per scientist than our other locations.”

    For a statistical glimpse of this accretion at work, read the World Economic Forum’s latest annual competitiveness index, which ranks countries by a number of economic criteria. For the third year in a row, the U.S. has slipped and China has crept up. To be sure, the U.S. still ranks fifth in the world and China is a distant 26th, but the gap is slowly closing.

    [div class=attrib]Read the entire article here.[end-div]

    The Evils of Television

    Much has been written on the subject of television. Its effects on our culture in general and on the young minds of our children in particular have been studied and documented for decades. Increased levels of violence, the obesity epidemic, social fragmentation, vulgarity and voyeurism, caustic politics, poor attention span — all of these have been linked, at some time or other, to that little black box in the corner (increasingly, the big flat space above the mantle).

    In his article, A Nation of Vidiots, Jeffrey D. Sachs weighs in on the subject.

    [div class=attrib]From Project Syndicate:[end-div]

    The past half-century has been the age of electronic mass media. Television has reshaped society in every corner of the world. Now an explosion of new media devices is joining the TV set: DVDs, computers, game boxes, smart phones, and more. A growing body of evidence suggests that this media proliferation has countless ill effects.

    The United States led the world into the television age, and the implications can be seen most directly in America’s long love affair with what Harlan Ellison memorably called “the glass teat.” In 1950, fewer than 8% of American households owned a TV; by 1960, 90% had one. That level of penetration took decades longer to achieve elsewhere, and the poorest countries are still not there.

    True to form, Americans became the greatest TV watchers, which is probably still true today, even though the data are somewhat sketchy and incomplete. The best evidence suggests that Americans watch more than five hours per day of television on average – a staggering amount, given that several hours more are spent in front of other video-streaming devices. Other countries log far fewer viewing hours. In Scandinavia, for example, time spent watching TV is roughly half the US average.

    The consequences for American society are profound, troubling, and a warning to the world – though it probably comes far too late to be heeded. First, heavy TV viewing brings little pleasure. Many surveys show that it is almost like an addiction, with a short-term benefit leading to long-term unhappiness and remorse. Such viewers say that they would prefer to watch less than they do.

    Moreover, heavy TV viewing has contributed to social fragmentation. Time that used to be spent together in the community is now spent alone in front of the screen. Robert Putnam, the leading scholar of America’s declining sense of community, has found that TV viewing is the central explanation of the decline of “social capital,” the trust that binds communities together. Americans simply trust each other less than they did a generation ago. Of course, many other factors are at work, but television-driven social atomization should not be understated.

    Certainly, heavy TV viewing is bad for one’s physical and mental health. Americans lead the world in obesity, with roughly two-thirds of the US population now overweight. Again, many factors underlie this, including a diet of cheap, unhealthy fried foods, but the sedentary time spent in front of the TV is an important influence as well.

    At the same time, what happens mentally is as important as what happens physically. Television and related media have been the greatest purveyors and conveyors of corporate and political propaganda in society.

    [div class=attrib]Read more of this article here.[end-div]

    [div class=attrib]Family watching television, c. 1958. Image courtesy of Wikipedia.[end-div]

    Texi as the Plural for Texas?

    Imagine more than one state of Texas. Or, imagine the division of Texas into a handful of sub-states smaller in size and perhaps more manageable. Frank Jacobs over at Strange Maps ponders a United States where there could be more than one Texas.

    [div class=attrib]From Strange Maps:[end-div]

    The plural of Texas? My money’s on Texases, even though that sounds almost as wrong as Texae, Texi or whatever alternative you might try to think up. Texas is defiantly singular. It is the Lone Star State, priding itself on its brief independence and distinct culture. Discounting Alaska, it is also the largest state in the Union.

    Texas is both a maverick and a behemoth, and as much a claimant to exceptionalism within the US as America itself is on the world stage. Texans are superlative Americans. When other countries reach for an American archetype to caricature (or to demonise), it’s often one they imagine having a Texan drawl: the greedy oil baron, the fundamentalist preacher, the trigger-happy cowboy (1).

    Texans will rightly object to being pigeonholed, but they probably won’t mind the implied reference to their tough-guy image. Nobody minds being provided with some room to swagger. See also the popularity of the slogan Don’t Mess With Texas, the state’s unofficial motto. It is less historical than it sounds, beginning life only in 1986 as the tagline of an anti-littering campaign.

    You’d have to be crazy to mess with a state that’s this big and fierce. In fact, you’d have to be Texas to mess with Texas. Really. That’s not just a clever put-down. It’s the law. When Texas joined the Union in 1845, voluntarily giving up its independence, it was granted the right by Congress to form “new States of convenient size, not exceeding four in number and in addition to the said State of Texas.”

    This would increase the total number of Texases to five, and enhance their political weight – at least in the US Senate, which would have to make room for 10 Senators from all five states combined, as opposed to just the twosome that represents the single state of Texas now.

    In 2009, the political blog FiveThirtyEight overlaid their plan on a county-level map of the Obama-McCain presidential election results (showing Texas to be overwhelmingly red, except for a band of blue along the Rio Grande). The five Texases are:

    • (New) Texas, comprising the Austin-San Antonio metropolitan area in central Texas;
    • Trinity, uniting Dallas, Fort Worth and Arlington;
    • Gulfland, along the coast and including Houston;
    • Plainland, from Lubbock all the way up the panhandle (with 40% of Texas’s territory, the largest successor state);
    • El Norte, south of the other states but north of Mexico, where most of the new state’s 85% Hispanics would have their roots.

    [div class=attrib]Read the entire article here.[end-div]

    A Better Way to Study and Learn

    Our current educational process in one sentence: assume student is empty vessel; provide student with content; reward student for remembering and regurgitating content; repeat.

    Yet, we have known for a while, and an increasing body of research corroborates our belief, that this method of teaching and learning is not very effective, or stimulating for that matter. It’s simply an efficient mechanism for the mass production of an adequate resource for the job market. Of course, for most it then takes many more decades following high school or college to unlearn the rote trivia and re-learn what is really important.

    Mind Hacks reviews some recent studies that highlight better approaches to studying.

    [div class=attrib]From Mind Hacks:[end-div]

    Decades-old research into how memory works should have revolutionised University teaching. It didn’t.

    If you’re a student, what I’m about to tell you will let you change how you study so that it is more effective, more enjoyable and easier. If you work at a University, you – like me – should hang your head in shame that we’ve known this for decades but still teach the way we do.

    There’s a dangerous idea in education that students are receptacles, and teachers are responsible for providing content that fills them up. This model encourages us to test students by the amount of content they can regurgitate, to focus overly on statements rather than skills in assessment and on syllabuses rather than values in teaching. It also encourages us to believe that we should try and learn things by trying to remember them. Sounds plausible, perhaps, but there’s a problem. Research into the psychology of memory shows that intention to remember is a very minor factor in whether you remember something or not. Far more important than whether you want to remember something is how you think about the material when you encounter it.

    A classic experiment by Hyde and Jenkins (1973) illustrates this. These researchers gave participants lists of words, which they later tested recall of, as their memory items. To affect their thinking about the words, half the participants were told to rate the pleasantness of each word, and half were told to check if the word contained the letters ‘e’ or ‘g’. This manipulation was designed to affect ‘depth of processing’. The participants in the rating-pleasantness condition had to think about what the word meant, and relate it to themselves (how they felt about it) – “deep processing”. Participants in the letter-checking condition just had to look at the shape of the letters; they didn’t even have to read the word if they didn’t want to – “shallow processing”. The second, independent, manipulation concerned whether participants knew that they would be tested later on the words. Half of each group were told this – the “intentional learning” condition – and half weren’t told; for them the test would come as a surprise – the “incidental learning” condition.

    [div class=attrib]Read the entire article here.[end-div]

    [div class=attrib]Image courtesy of the Telegraph / AP.[end-div]

    The World Wide Web of Terrorism

    [div class=attrib]From Eurozine:[end-div]

    There are clear signs that Internet radicalization was behind the terrorism of Anders Behring Breivik. Though most research on the subject focuses on jihadism, it can teach us a lot about how Internet radicalization of all kinds can be fought.

    On 21 September 2010, Interpol released a press statement on their homepage warning against extremist websites. They pointed out that this is a global threat and that ever more terrorist groups use the Internet to radicalize young people.

    “Terrorist recruiters exploit the web to their full advantage as they target young, middle class vulnerable individuals who are usually not on the radar of law enforcement”, said Secretary General Ronald K. Noble. He continued: “The threat is global; it is virtual; and it is on our doorsteps. It is a global threat that only international police networks can fully address.”

    Noble pointed out that the Internet has made the radicalization process easier and the war on terror more difficult. Part of the reason, he claimed, is that much of what takes place is not really criminal.

    Much research has been done on Internet radicalization over the last few years but the emphasis has been on Islamist terror. The phenomenon can be summarized thus: young boys and men of Muslim background have, via the Internet, been exposed to propaganda, films from war zones, horrifying images of war in Afghanistan, Iraq and Chechnya, and also extreme interpretations of Islam. They are, so to speak, caught in the web, and some have resorted to terrorism, or at least planned it. The BBC documentary Generation Jihad gives an interesting and frightening insight into the phenomenon.

    Researchers Tim Stevens and Peter Neumann write in a report focused on Islamist Internet radicalization that Islamist groups are hardly unique in putting the Internet in the service of political extremism:
    Although Al Qaeda-inspired Islamist militants represented the most significant terrorist threat to the United Kingdom at the time of writing, Islamist militants are not the only – or even the predominant – group of political extremists engaged in radicalization and recruitment on the internet. Visitor numbers are notoriously difficult to verify, but some of the most popular Islamist militant web forums (for example, Al Ekhlaas, Al Hesbah, or Al Boraq) are easily rivalled in popularity by white supremacist websites such as Stormfront.

    Strikingly, Stormfront – an international Internet forum advocating “white nationalism” and dominated by neo-Nazis – is one of the websites visited by the terrorist Anders Behring Breivik, and a forum where he also left comments. In one place he writes about his hope that “the various fractured rightwing movements in Europe and the US reach a common consensus regarding the ‘Islamification of Europe/US’ can try and reach a consensus regarding the issue”. He continues: “After all, we all want the best for our people, and we owe it to them to try to create the most potent alliance which will have the strength to overthrow the governments which support multiculturalism.”

    [div class=attrib]Read more of this article here.[end-div]

    [div class=attrib]Image courtesy of Eurozine.[end-div]

    Corporations As People And the Threat to Truth

    In 2010 the U.S. Supreme Court ruled that corporations can be treated as people, assigning companies First Amendment rights under the Constitution. So, it’s probably only a matter of time before a real person legally marries (and divorces) a corporation. And we’re probably not too far from a future where an American corporate CEO can take the life of a competing company’s boss and “rightfully” declare that it was in competitive self-defense.

    In the meantime, the growing, and much needed, debate over corporate power, corporate responsibility and corporate consciousness rolls on. A timely opinion by Gary Gutting over at the New York Times, gives us more on which to chew.

    [div class=attrib]From the New York Times:[end-div]

    The Occupy Wall Street protest movement has raised serious questions about the role of capitalist institutions, particularly corporations, in our society. Well before the first protester set foot in Zuccotti Park, a heckler urged Mitt Romney to tax corporations rather than people. Romney’s response — “Corporations are people” — stirred a brief but intense controversy. Now thousands of demonstrators have in effect joined the heckler, denouncing corporations as “enemies of the people.”

    Who’s right? Thinking pedantically, we can see ways in which Romney was literally correct; for example, corporations are nothing other than the people who own, run and work for them, and they are recognized as “persons” in some technical legal sense.  But it is also obvious that corporations are not people in a full moral sense: they cannot, for example, fall in love, write poetry or be depressed.

    Far more important than questions about what corporations are (ontological questions, as philosophers say) is the question of what attitude we should have toward them.  Should we, as corporate public relations statements often suggest, think of them as friends (if we buy and are satisfied with their products) or as family (if we work for them)?  Does it make sense to be loyal to a corporation as either a customer or as an employee?  More generally, even granted that corporations are not fully persons in the way that individuals are, do they have some important moral standing in our society?

    My answer to all these questions is no, because corporations have no core dedication to fundamental human values.  (To be clear, I am speaking primarily of large, for-profit, publicly owned corporations.)  Such corporations exist as instruments of profit for their shareholders.  This does not mean that they are inevitably evil or that they do not make essential economic contributions to society.  But it does mean that their moral and social value is entirely instrumental.   There are ways we can use corporations as means to achieve fundamental human values, but corporations do not of themselves work for these values. In fact, left to themselves, they can be serious threats to human values that conflict with the goal of corporate profit.

    Corporations are a particular threat to truth, a value essential in a democracy, which places a premium on the informed decisions of individual citizens.  The corporate threat is most apparent in advertising, which explicitly aims at convincing us to prefer a product regardless of its actual merit.

    [div class=attrib]Read more here.[end-div]

    [div class=attrib]Time Saving Truth from Falsehood and Envy by François Lemoyne. Image courtesy of Wikipedia / Wallace Collection, London.[end-div]

    How Many People Have Died?

    Ever wonder how many people have gone before? The succinct infographic courtesy of Jon Gosier takes a good stab at answering the question. First, a few assumptions and explanations:

    The numbers in this piece are speculative but are as accurate as modern research allows. It’s widely accepted that prior to 2002 there had been somewhere between 106 and 140 billion Homo sapiens born to the world. The graphic below uses the conservative number (106 bn) as the basis for a circle graph. The center dot represents how many people are currently living (red) versus the dead (white). The dashed vertical line shows how much time passed between milestones. The spectral graph immediately below this text illustrates the population ‘benchmarks’ that were used to estimate the population over time. Adding the population numbers gets you to 106 billion. The red sphere is then used to compare against other data.
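
    As a rough sanity check on the ratio behind that center dot, here is a minimal Python sketch. The 106 billion total is the figure used in the text above; the roughly 7 billion currently alive is my own assumption (world population around the time of writing), not a number taken from the infographic.

    # Share of everyone ever born who is alive today (rough estimate).
    ever_born = 106e9   # conservative total-births figure used by the graphic
    alive_now = 7e9     # assumed current world population, not from the infographic

    living_share = alive_now / ever_born
    print(f"Living: {living_share:.1%} of everyone ever born")  # roughly 6-7%
    print(f"Dead:   {1 - living_share:.1%}")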

    [div class=attrib]Check out the original here.[end-div]

    Mapping the Murder Rate

    A sad but nonetheless interesting infographic of murder rates throughout the world. The rates are per 100,000 of the population. The United States, with a rate of 5 per 100,000, ranks close to Belarus, Peru and Thailand. Interestingly, it has a higher murder rate than Turkmenistan (4.4), Uzbekistan (3.1), Afghanistan (2.4), Syria (3) and Iran (3).

    The top 5 countries with the highest murder rates are:

    Selflessness versus Selfishness: Either Extreme Can Be Bad

    [div class=attrib]From the New York Times:[end-div]

    Some years ago, Dr. Robert A. Burton was the neurologist on call at a San Francisco hospital when a high-profile colleague from the oncology department asked him to perform a spinal tap on an elderly patient with advanced metastatic cancer. The patient had seemed a little fuzzy-headed that morning, and the oncologist wanted to check for meningitis or another infection that might be treatable with antibiotics.

    Dr. Burton hesitated. Spinal taps are painful. The patient’s overall prognosis was beyond dire. Why go after an ancillary infection? But the oncologist, known for his uncompromising and aggressive approach to treatment, insisted.

    “For him, there was no such thing as excessive,” Dr. Burton said in a telephone interview. “For him, there was always hope.”

    On entering the patient’s room with spinal tap tray portentously agleam, Dr. Burton encountered the patient’s family members. They begged him not to proceed. The frail, bedridden patient begged him not to proceed. Dr. Burton conveyed their pleas to the oncologist, but the oncologist continued to lobby for a spinal tap, and the exhausted family finally gave in.

    As Dr. Burton had feared, the procedure proved painful and difficult to administer. It revealed nothing of diagnostic importance. And it left the patient with a grinding spinal-tap headache that lasted for days, until the man fell into a coma and died of his malignancy.

    Dr. Burton had admired his oncology colleague (now deceased), yet he also saw how the doctor’s zeal to heal could border on fanaticism, and how his determination to help his patients at all costs could perversely end up hurting them.

    The author of “On Being Certain” and the coming “A Skeptic’s Guide to the Mind,” Dr. Burton is a contributor to a scholarly yet surprisingly sprightly volume called “Pathological Altruism,” to be published this fall by Oxford University Press. And he says his colleague’s behavior is a good example of that catchily contradictory term, just beginning to make the rounds through the psychological sciences.

    As the new book makes clear, pathological altruism is not limited to showcase acts of self-sacrifice, like donating a kidney or a part of one’s liver to a total stranger. The book is the first comprehensive treatment of the idea that when ostensibly generous “how can I help you?” behavior is taken to extremes, misapplied or stridently rhapsodized, it can become unhelpful, unproductive and even destructive.

    Selflessness gone awry may play a role in a broad variety of disorders, including anorexia and animal hoarding, women who put up with abusive partners and men who abide alcoholic ones.

    [div class=attrib]Read more here.[end-div]

    [div class=attrib]Image courtesy of Serge Bloch, New York Times.[end-div]

    A Commencement Address for Each of Us: Stay Hungry. Stay Foolish.

    Much has been written to honor the life of Steve Jobs, who passed away on October 5, 2011 at the young age of 56. Much more will be written. To honor his vision and passion we reprint below a rare public speech given by Steve Jobs at the Stanford University Commencement on June 12, 2005. The address is a very personal and thoughtful story of innovation, love and loss, and death.

    [div class=attrib]Courtesy of Stanford University:[end-div]

    I am honored to be with you today at your commencement from one of the finest universities in the world. I never graduated from college. Truth be told, this is the closest I’ve ever gotten to a college graduation. Today I want to tell you three stories from my life. That’s it. No big deal. Just three stories.

    The first story is about connecting the dots.

    I dropped out of Reed College after the first 6 months, but then stayed around as a drop-in for another 18 months or so before I really quit. So why did I drop out?

    It started before I was born. My biological mother was a young, unwed college graduate student, and she decided to put me up for adoption. She felt very strongly that I should be adopted by college graduates, so everything was all set for me to be adopted at birth by a lawyer and his wife. Except that when I popped out they decided at the last minute that they really wanted a girl. So my parents, who were on a waiting list, got a call in the middle of the night asking: “We have an unexpected baby boy; do you want him?” They said: “Of course.” My biological mother later found out that my mother had never graduated from college and that my father had never graduated from high school. She refused to sign the final adoption papers. She only relented a few months later when my parents promised that I would someday go to college.

    And 17 years later I did go to college. But I naively chose a college that was almost as expensive as Stanford, and all of my working-class parents’ savings were being spent on my college tuition. After six months, I couldn’t see the value in it. I had no idea what I wanted to do with my life and no idea how college was going to help me figure it out. And here I was spending all of the money my parents had saved their entire life. So I decided to drop out and trust that it would all work out OK. It was pretty scary at the time, but looking back it was one of the best decisions I ever made. The minute I dropped out I could stop taking the required classes that didn’t interest me, and begin dropping in on the ones that looked interesting.

    It wasn’t all romantic. I didn’t have a dorm room, so I slept on the floor in friends’ rooms, I returned coke bottles for the 5¢ deposits to buy food with, and I would walk the 7 miles across town every Sunday night to get one good meal a week at the Hare Krishna temple. I loved it. And much of what I stumbled into by following my curiosity and intuition turned out to be priceless later on. Let me give you one example:

    Reed College at that time offered perhaps the best calligraphy instruction in the country. Throughout the campus every poster, every label on every drawer, was beautifully hand calligraphed. Because I had dropped out and didn’t have to take the normal classes, I decided to take a calligraphy class to learn how to do this. I learned about serif and sans serif typefaces, about varying the amount of space between different letter combinations, about what makes great typography great. It was beautiful, historical, artistically subtle in a way that science can’t capture, and I found it fascinating.

    None of this had even a hope of any practical application in my life. But ten years later, when we were designing the first Macintosh computer, it all came back to me. And we designed it all into the Mac. It was the first computer with beautiful typography. If I had never dropped in on that single course in college, the Mac would have never had multiple typefaces or proportionally spaced fonts. And since Windows just copied the Mac, it’s likely that no personal computer would have them. If I had never dropped out, I would have never dropped in on this calligraphy class, and personal computers might not have the wonderful typography that they do. Of course it was impossible to connect the dots looking forward when I was in college. But it was very, very clear looking backwards ten years later.

    Again, you can’t connect the dots looking forward; you can only connect them looking backwards. So you have to trust that the dots will somehow connect in your future. You have to trust in something — your gut, destiny, life, karma, whatever. This approach has never let me down, and it has made all the difference in my life.

    My second story is about love and loss.

    I was lucky — I found what I loved to do early in life. Woz and I started Apple in my parents’ garage when I was 20. We worked hard, and in 10 years Apple had grown from just the two of us in a garage into a $2 billion company with over 4000 employees. We had just released our finest creation — the Macintosh — a year earlier, and I had just turned 30. And then I got fired. How can you get fired from a company you started? Well, as Apple grew we hired someone who I thought was very talented to run the company with me, and for the first year or so things went well. But then our visions of the future began to diverge and eventually we had a falling out. When we did, our Board of Directors sided with him. So at 30 I was out. And very publicly out. What had been the focus of my entire adult life was gone, and it was devastating.

    I really didn’t know what to do for a few months. I felt that I had let the previous generation of entrepreneurs down – that I had dropped the baton as it was being passed to me. I met with David Packard and Bob Noyce and tried to apologize for screwing up so badly. I was a very public failure, and I even thought about running away from the valley. But something slowly began to dawn on me — I still loved what I did. The turn of events at Apple had not changed that one bit. I had been rejected, but I was still in love. And so I decided to start over.

    I didn’t see it then, but it turned out that getting fired from Apple was the best thing that could have ever happened to me. The heaviness of being successful was replaced by the lightness of being a beginner again, less sure about everything. It freed me to enter one of the most creative periods of my life.

    During the next five years, I started a company named NeXT, another company named Pixar, and fell in love with an amazing woman who would become my wife. Pixar went on to create the world’s first computer animated feature film, Toy Story, and is now the most successful animation studio in the world. In a remarkable turn of events, Apple bought NeXT, I returned to Apple, and the technology we developed at NeXT is at the heart of Apple’s current renaissance. And Laurene and I have a wonderful family together.

    I’m pretty sure none of this would have happened if I hadn’t been fired from Apple. It was awful tasting medicine, but I guess the patient needed it. Sometimes life hits you in the head with a brick. Don’t lose faith. I’m convinced that the only thing that kept me going was that I loved what I did. You’ve got to find what you love. And that is as true for your work as it is for your lovers. Your work is going to fill a large part of your life, and the only way to be truly satisfied is to do what you believe is great work. And the only way to do great work is to love what you do. If you haven’t found it yet, keep looking. Don’t settle. As with all matters of the heart, you’ll know when you find it. And, like any great relationship, it just gets better and better as the years roll on. So keep looking until you find it. Don’t settle.

    My third story is about death.

    When I was 17, I read a quote that went something like: “If you live each day as if it was your last, someday you’ll most certainly be right.” It made an impression on me, and since then, for the past 33 years, I have looked in the mirror every morning and asked myself: “If today were the last day of my life, would I want to do what I am about to do today?” And whenever the answer has been “No” for too many days in a row, I know I need to change something.

    Remembering that I’ll be dead soon is the most important tool I’ve ever encountered to help me make the big choices in life. Because almost everything — all external expectations, all pride, all fear of embarrassment or failure – these things just fall away in the face of death, leaving only what is truly important. Remembering that you are going to die is the best way I know to avoid the trap of thinking you have something to lose. You are already naked. There is no reason not to follow your heart.

    About a year ago I was diagnosed with cancer. I had a scan at 7:30 in the morning, and it clearly showed a tumor on my pancreas. I didn’t even know what a pancreas was. The doctors told me this was almost certainly a type of cancer that is incurable, and that I should expect to live no longer than three to six months. My doctor advised me to go home and get my affairs in order, which is doctor’s code for prepare to die. It means to try to tell your kids everything you thought you’d have the next 10 years to tell them in just a few months. It means to make sure everything is buttoned up so that it will be as easy as possible for your family. It means to say your goodbyes.

    I lived with that diagnosis all day. Later that evening I had a biopsy, where they stuck an endoscope down my throat, through my stomach and into my intestines, put a needle into my pancreas and got a few cells from the tumor. I was sedated, but my wife, who was there, told me that when they viewed the cells under a microscope the doctors started crying because it turned out to be a very rare form of pancreatic cancer that is curable with surgery. I had the surgery and I’m fine now.

    This was the closest I’ve been to facing death, and I hope it’s the closest I get for a few more decades. Having lived through it, I can now say this to you with a bit more certainty than when death was a useful but purely intellectual concept:

    No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because Death is very likely the single best invention of Life. It is Life’s change agent. It clears out the old to make way for the new. Right now the new is you, but someday not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it is quite true.

    Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma — which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice. And most important, have the courage to follow your heart and intuition. They somehow already know what you truly want to become. Everything else is secondary.

    When I was young, there was an amazing publication called The Whole Earth Catalog, which was one of the bibles of my generation. It was created by a fellow named Stewart Brand not far from here in Menlo Park, and he brought it to life with his poetic touch. This was in the late 1960’s, before personal computers and desktop publishing, so it was all made with typewriters, scissors, and polaroid cameras. It was sort of like Google in paperback form, 35 years before Google came along: it was idealistic, and overflowing with neat tools and great notions.

    Stewart and his team put out several issues of The Whole Earth Catalog, and then when it had run its course, they put out a final issue. It was the mid-1970s, and I was your age. On the back cover of their final issue was a photograph of an early morning country road, the kind you might find yourself hitchhiking on if you were so adventurous. Beneath it were the words: “Stay Hungry. Stay Foolish.” It was their farewell message as they signed off. Stay Hungry. Stay Foolish. And I have always wished that for myself. And now, as you graduate to begin anew, I wish that for you.

    Stay Hungry. Stay Foolish.

    Thank you all very much.

    Steve Jobs: The Secular Prophet

    The world will miss Steve Jobs.

    In early 2010 the U.S. Supreme Court overturned years of legal precedent by extending First Amendment (free speech) protections to corporate political spending. We could argue the merits and demerits of this staggering ruling until the cows come home. However, one thing is clear if corporations are to be judged as people: the world would in all likelihood benefit more from a corporation with a human, optimistic and passionate face (Apple) than from a faceless one (Exxon), an ideological one (News Corp) or an opaque one (Koch Industries).

    That said, we excerpt a fascinating essay on Steve Jobs by Andy Crouch below. We would encourage Mr. Crouch to take this worthy idea further by examining the Fortune 1000 list of corporations. Could he deliver a similar analysis for each of these corporations’ leaders? We believe not.

    The world will miss Steve Jobs.

    [div class=attrib]By Andy Crouch for the Wall Street Journal:[end-div]

    Steve Jobs was extraordinary in countless ways—as a designer, an innovator, a (demanding and occasionally ruthless) leader. But his most singular quality was his ability to articulate a perfectly secular form of hope. Nothing exemplifies that ability more than Apple’s early logo, which slapped a rainbow on the very archetype of human fallenness and failure—the bitten fruit—and turned it into a sign of promise and progress.

    That bitten apple was just one of Steve Jobs’s many touches of genius, capturing the promise of technology in a single glance. The philosopher Albert Borgmann has observed that technology promises to relieve us of the burden of being merely human, of being finite creatures in a harsh and unyielding world. The biblical story of the Fall pronounced a curse upon human work—”cursed is the ground for thy sake; in sorrow shalt thou eat of it all the days of thy life.” All technology implicitly promises to reverse the curse, easing the burden of creaturely existence. And technology is most celebrated when it is most invisible—when the machinery is completely hidden, combining godlike effortlessness with blissful ignorance about the mechanisms that deliver our disburdened lives.

    Steve Jobs was the evangelist of this particular kind of progress—and he was the perfect evangelist because he had no competing source of hope. He believed so sincerely in the “magical, revolutionary” promise of Apple precisely because he believed in no higher power. In his celebrated Stanford commencement address (which is itself an elegant, excellent model of the genre), he spoke frankly about his initial cancer diagnosis in 2003. It’s worth pondering what Jobs did, and didn’t, say:

    “No one wants to die. Even people who want to go to heaven don’t want to die to get there. And yet death is the destination we all share. No one has ever escaped it. And that is as it should be, because death is very likely the single best invention of life. It’s life’s change agent; it clears out the old to make way for the new. Right now, the new is you. But someday, not too long from now, you will gradually become the old and be cleared away. Sorry to be so dramatic, but it’s quite true. Your time is limited, so don’t waste it living someone else’s life. Don’t be trapped by dogma, which is living with the results of other people’s thinking. Don’t let the noise of others’ opinions drown out your own inner voice, heart and intuition. They somehow already know what you truly want to become.”

    This is the gospel of a secular age.

    [div class=attrib]Steve Jobs by Tim O’Brien, image courtesy of Wall Street Journal.[end-div]

    Misconceptions of Violence

    We live in violent times. Or do we?

    Despite the seemingly constant flow of human-engineered destruction visited upon our fellow humans, other species and our precious environment, some thoughtful analysis — beyond the headlines of cable news — shows that all may not be lost to our violent nature. An insightful interview with psychologist Steven Pinker, author of “How the Mind Works,” shows us that contemporary humans are not as bad as we may have thought. His latest book, “The Better Angels of Our Nature: Why Violence Has Declined,” analyzes the basis and history of human violence. Perhaps surprisingly, Pinker suggests that we live in remarkably peaceful times, comparatively speaking. Characteristically, he backs up his claims with clear historical evidence.

    [div class=attrib]From Gareth Cook for Mind Matters:[end-div]

    COOK: What would you say is the biggest misconception people have about violence?
    PINKER: That we are living in a violent age. The statistics suggest that this may be the most peaceable time in our species’s existence.

    COOK: Can you give a sense for how violent life was 500 or 1000 years ago?
    PINKER: Statistics aside, accounts of daily life in medieval and early modern Europe reveal a society soaked in blood and gore. Medieval knights—whom today we would call warlords—fought their numerous private wars with a single strategy: kill as many of the opposing knight’s peasants as possible. Religious instruction included prurient descriptions of how the saints of both sexes were tortured and mutilated in ingenious ways. Corpses broken on the wheel, hanging from gibbets, or rotting in iron cages where the sinner had been left to die of exposure and starvation were a common part of the landscape. For entertainment, one could nail a cat to a post and try to head-butt it to death, or watch a political prisoner get drawn and quartered, which is to say partly strangled, disemboweled, and castrated before being decapitated. So many people had their noses cut off in private disputes that medical textbooks had procedures that were alleged to grow them back.

    COOK: How has neuroscience contributed to our understanding of violence and its origins?
    PINKER: Neuroscientists have long known that aggression in animals is not a unitary phenomenon driven by a single hormone or center. When they stimulate one part of the brain of a cat, it will lunge for the experimenter in a hissing, fangs-out rage; when they stimulate another, it will silently stalk a hallucinatory mouse. Still another circuit primes a male cat for a hostile confrontation with another male. Similar systems for rage, predatory seeking, and male-male aggression may be found in Homo sapiens, together with uniquely human, cognitively-driven  systems of aggression such as political and religious ideologies and moralistic punishment. Today, even the uniquely human systems can be investigated using functional neuroimaging. So neuroscience has given us the crucial starting point in understanding violence, namely that it is not a single thing. And it has helped us to discover biologically realistic taxonomies of the major motives for violence.

    COOK: Is the general trend toward less violence going to continue in the future?
    PINKER: It depends. In the arena of custom and institutional practices, it’s a good bet. I suspect that violence against women, the criminalization of homosexuality, the use of capital punishment, the callous treatment of animals on farms, corporal punishment of children, and other violent social practices will continue to decline, based on the fact that worldwide moralistic shaming movements in the past (such as those against slavery, whaling, piracy, and punitive torture) have been effective over long stretches of time. I also don’t expect war between developed countries to make a comeback any time soon. But civil wars, terrorist acts, government repression, and genocides in backward parts of the world are simply too capricious to allow predictions. With six billion people in the world, there’s no predicting what some cunning fanatic or narcissistic despot might do.

    [div class=attrib]Read more of the interview here.[end-div]

    [div class=attrib]Image courtesy of Scientific American.[end-div]

    All Power Corrupts

    [div class=attrib]From the Economist:[end-div]

    DURING the second world war a new term of abuse entered the English language. To call someone “a little Hitler” meant he was a menial functionary who employed what power he had in order to annoy and frustrate others for his own gratification. From nightclub bouncers to the squaddies at Abu Ghraib prison who tormented their prisoners for fun, little Hitlers plague the world. The phenomenon has not, though, hitherto been subject to scientific investigation.

    Nathanael Fast of the University of Southern California has changed that. He observed that lots of psychological experiments have been done on the effects of status and lots on the effects of power. But few, if any, have been done on both combined. He and his colleagues Nir Halevy of Stanford University and Adam Galinsky of Northwestern University, in Chicago, set out to correct this. In particular they wanted to see if it is circumstances that create little Hitlers or, rather, whether people of that type simply gravitate into jobs which allow them to behave badly. Their results have just been published in the Journal of Experimental Social Psychology.

    Dr Fast’s experiment randomly assigned each of 213 participants to one of four situations that manipulated their status and power. All participants were informed that they were taking part in a study on virtual organisations and would be interacting with, but not meeting, a fellow student who worked in the same fictional consulting firm. Participants were then assigned either the role of “idea producer”, a job that entailed generating and working with important ideas, or of “worker”, a job that involved menial tasks like checking for typos. A post-experiment questionnaire demonstrated that participants did, as might be expected, look upon the role of idea producer with respect and admiration. Equally unsurprisingly, they looked down on the role of worker.

    Participants who had both status and power did not greatly demean their partners. They chose an average of 0.67 demeaning activities for those partners to perform. Low-power/low-status and low-power/high-status participants behaved similarly. They chose, on average, 0.67 and 0.85 demeaning activities. However, participants who were low in status but high in power—the classic “little Hitler” combination—chose an average of 1.12 deeply demeaning tasks for their partners to engage in. That was a highly statistically significant distinction.

    Of course, not everybody in the high-power/low-status quadrant of the experiment behaved badly. Underlying personality may still have a role. But as with previous experiments in which random members of the public have been asked to play prison guard or interrogator, Dr Fast’s result suggests that many quite ordinary people will succumb to bad behaviour if the circumstances are right.
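
    For readers who want a feel for the arithmetic behind a phrase like “highly statistically significant”, here is a minimal sketch in Python. The per-participant counts are invented (the article reports only the group averages of 0.67, 0.67, 0.85 and 1.12), so this illustrates the kind of comparison involved rather than Dr Fast’s actual analysis.

        # A sketch with made-up data, not the study's analysis: simulate counts of
        # demeaning tasks around the averages quoted above, then run a permutation
        # test asking how often the high-power/low-status ("little Hitler") group
        # would beat the pooled other groups by chance alone.
        import numpy as np

        rng = np.random.default_rng(0)
        n_per_group = 53  # roughly 213 participants split across four conditions

        # Hypothetical per-participant counts drawn around the reported means.
        groups = {
            "high_power_high_status": rng.poisson(0.67, n_per_group),
            "low_power_low_status":   rng.poisson(0.67, n_per_group),
            "low_power_high_status":  rng.poisson(0.85, n_per_group),
            "high_power_low_status":  rng.poisson(1.12, n_per_group),
        }

        target = groups["high_power_low_status"]
        others = np.concatenate([v for k, v in groups.items()
                                 if k != "high_power_low_status"])
        observed = target.mean() - others.mean()

        # Permutation test: shuffle the pooled counts and recompute the difference.
        pooled = np.concatenate([target, others])
        n_target = len(target)
        n_iter = 10_000
        exceed = 0
        for _ in range(n_iter):
            rng.shuffle(pooled)
            if pooled[:n_target].mean() - pooled[n_target:].mean() >= observed:
                exceed += 1

        print(f"observed difference in means: {observed:.2f}")
        print(f"one-sided permutation p-value: {exceed / n_iter:.3f}")

    With the real per-participant data one would of course use the actual counts (and the paper’s own tests) rather than simulated draws; the point here is only to show what separating a group mean from chance variation looks like.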

    [div class=attrib]Read more here.[end-div]

    [div class=attrib]Image courtesy of the Economist / Getty Images.[end-div]

    The Cult of the Super Person

    It is undeniable that there is ever-increasing societal pressure on children to perform, compete, achieve and succeed, and to do so at ever younger ages. However, while average college admission test scores have improved, it’s also arguable that admission standards have dropped. So the picture painted by James Atlas in the article below is far from clear. Nonetheless, it’s disturbing that our children get less and less time to dream, play, explore and get dirty.

    [div class=attrib]From the New York Times:[end-div]

    A BROCHURE arrives in the mail announcing this year’s winners of a prestigious fellowship to study abroad. The recipients are allotted a full page each, with a photo and a thick paragraph chronicling their achievements. It’s a select group to begin with, but even so, there doesn’t seem to be anyone on this list who hasn’t mastered at least one musical instrument; helped build a school or hospital in some foreign land; excelled at a sport; attained fluency in two or more languages; had both a major and a minor, sometimes two, usually in unrelated fields (philosophy and molecular science, mathematics and medieval literature); and yet found time — how do they have any? — to enjoy such arduous hobbies as mountain biking and white-water kayaking.

    Let’s call this species Super Person.

    Do we have some anomalous cohort here? Achievement freaks on a scale we haven’t seen before? Has our hysterically competitive, education-obsessed society finally outdone itself in its tireless efforts to produce winners whose abilities are literally off the charts? And if so, what convergence of historical, social and economic forces has been responsible for the emergence of this new type? Why does Super Person appear among us now?

    Perhaps there’s an evolutionary cause, and these robust intellects reflect the leap in the physical development of humans that we ascribe to better diets, exercise and other forms of health-consciousness. (Stephen Jay Gould called this mechanism “extended scope.”) All you have to do is watch a long rally between Novak Djokovic and Rafael Nadal to recognize — if you’re old enough — how much faster the sport has become over the last half century.

    The Super Person training for the college application wars is the academic version of the Super Person slugging it out on the tennis court. For wonks, Harvard Yard is Arthur Ashe Stadium.

    Preparing for Super Personhood begins early. “We see kids who’ve been training from an early age,” says Charles Bardes, chairman of admissions at Weill Cornell Medical College. “The bar has been set higher. You have to be at the top of the pile.”

    And to clamber up there you need a head start. Thus the well-documented phenomenon of helicopter parents. In her influential book “Perfect Madness: Motherhood in the Age of Anxiety,” Judith Warner quotes a mom who gave up her career to be a full-time parent: “The children are the center of the household and everything goes around them. You want to do everything and be everything for them because this is your job now.” Bursting with pent-up energy, the mothers transfer their shelved career ambitions to their children. Since that book was published in 2005, the situation has only intensified. “One of my daughter’s classmates has a pilot’s license; 12-year-olds are taking calculus,” Ms. Warner said last week.

    [div class=attrib]Read more of this article here.[end-div]

    [div class=attrib]Image courtesy of Mark Todd. New York Times.[end-div]

    Is Our Children Learning: Testing the Standardized Tests

    Test grades once measured student performance. Nowadays test grades are used to measure teacher, parent, educational institution and even national performance. Gary Gutting over at The Stone forum has some instructive commentary.

    [div class=attrib]From the New York Times:[end-div]

    So what exactly do test scores tell us?

    Poor test scores are the initial premises in most current arguments for educational reform.  At the end of last year, reading scores that showed American 15-year-olds in the middle of an international pack, led by Asian countries, prompted calls from researchers and educators for immediate action.  This year two sociologists, Richard Arum and Josipa Roksa, showed that 45 percent of students, after two years of college, have made no significant gains on a test of critical thinking.  Last week’s report of falling SAT scores is the latest example.

    Given poor test results, many critics conclude that our schools are failing and propose plans for immediate action.  For example, when Arum and Roksa published their results, many concluded that college teachers need to raise standards in their courses, requiring more hours of study and assigning longer papers.

    It is, however, not immediately obvious what follows from poor test scores.  Without taking any position about the state of our schools or how, if at all, they need reform, I want to reflect on what we need to add to the fact of poor scores to construct an argument for changing the way we educate.

    The first question is whether a test actually tests for things that we want students to know.   We very seldom simply want students to do well on a test for its own sake.

    [div class=attrib]Read more of this article here.[end-div]

    [div class=attrib]Image courtesy of U.S. College Search.[end-div]

    Map Your Favorite Red (Wine)

    This season’s Beaujolais Nouveau is just over a month away, so what better way to pave the road to French wines than with a viticultural map. The wine map is based on Harry Beck’s iconic 1930s design for the London Tube (subway) map.

    [div class=attrib]From Frank Jacobs at Strange Maps:[end-div]

    The coloured lines on this wine map denote the main wine-producing regions in France, the dots are significant cities or towns in those regions. Names that branch off from the main line via little streaks are the so-called appellations [2].

    This schematic approach is illuminating for non-aficionados. In the first place, it clarifies the relation between region and appellation. For example: Médoc, Margaux and St-Emilion are three wines from the same region. So they are all Bordeaux wines, but each with its own appellation.

    Secondly, it provides a good indication of the geographic relation between appellations within regions. Chablis and Nuits-St-Georges are northern Burgundy wines, while Beaujolais is a southern one. It also permits some comparison between regions: Beaujolais, although a Burgundy, neighbours Côte Rôtie, a northern Rhône Valley wine.

    And lastly, it provides the names of the main grape varieties used in each region (the white ones italicised), like merlot or chardonnay.

    Which Couch, the Blue or White? Stubbornness and Social Pressure

    Counterintuitive results show that we are more likely to resist changing our minds when more people tell us we are wrong. A team of researchers from HP’s Social Computing Research Group found that humans are more likely to change their minds when fewer, rather than more, people disagree with them.

    [div class=attrib]From HP:[end-div]

    The research has practical applications for businesses, especially in marketing, suggests co-author Bernardo Huberman, Senior HP Fellow and director of HP’s Social Computing Research Group.

    “What this implies,” he says, “is that rather than overwhelming consumers with strident messages about an alternative product or service, in social media, gentle reporting of a few people having chosen that product or service can be more persuasive.”

    The experiment – devised by Huberman along with Haiyi Zhu, an HP Labs summer intern from Carnegie Mellon University, and Yarun Luon of HP Labs – reveals several other factors that determine whether choices can be reversed through social influence, too. It’s the latest product of HP Labs’ pioneering program in social computing, which is dedicated to creating software and algorithms that provide meaningful context to huge sets of unstructured data.

    Study results: the power of opinion
    Opinions and product ratings are everywhere online. But when do they actually influence our own choices?

    To find out, the HP team asked several hundred people to make a series of choices between two different pieces of furniture.  After varying amounts of time, they were asked to choose again between the same items, but this time they were told that a certain number of other people had preferred the opposite item.  (Separately, the experiment also asked subjects to choose between two different baby pictures, to control for variance in subject matter).

    Analysis of the resulting choices showed that receiving a small amount of social pressure to reverse one’s opinion (by being told that just a few people had chosen differently) was more likely to produce a reversed vote than when the pressure felt was much greater (i.e. when an overwhelming number of people were shown as having made a different choice).

    The team also discovered:

    – People were more likely to be influenced if they weren’t prompted to change their mind immediately after they had expressed their original preference.
    – The more time that people spent on their choice, the more likely they were to reverse that choice and conform to the opinion of others later on.
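
    To make the headline finding concrete, here is a minimal Python sketch that compares reversal rates under light and heavy social pressure with a standard two-proportion z-test. The counts and condition labels are hypothetical; the article does not include HP’s raw numbers.

        # Hypothetical illustration, not HP's data: did more subjects reverse their
        # choice when told a few people disagreed than when told most people did?
        from math import sqrt, erf

        def two_proportion_z(x1, n1, x2, n2):
            """z statistic and two-sided p-value for comparing two proportions."""
            p1, p2 = x1 / n1, x2 / n2
            pooled = (x1 + x2) / (n1 + n2)
            se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
            z = (p1 - p2) / se
            p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal approximation
            return z, p

        few_opposed = (42, 150)    # (reversed their vote, total) when "a few" disagreed
        many_opposed = (27, 150)   # (reversed their vote, total) when "most" disagreed

        print("reversal rate, few opposed: ", round(few_opposed[0] / few_opposed[1], 2))
        print("reversal rate, many opposed:", round(many_opposed[0] / many_opposed[1], 2))
        print("z = %.2f, p = %.3f" % two_proportion_z(*few_opposed, *many_opposed))

    A real analysis would also account for the time delay before the second choice and for the baby-picture control condition mentioned above.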

    [div class=attrib]More of this fascinating article here.[end-div]

    Complex Decision To Make? Go With the Gut

    Over the last couple of years a number of researchers have upended conventional wisdom by finding that complex decisions (for instance, those involving many variables) are better “made” through our emotional system. This flies in the face of the commonly held belief that complexity is best handled by our rational side.

    [div class=attrib]Jonah Lehrer over at the Frontal Cortex brings us up to date on current thinking.[end-div]

    We live in a world filled with difficult decisions. In fact, we’ve managed to turn even trivial choices – say, picking a toothpaste – into a tortured mental task, as the typical supermarket has more than 200 different dental cleaning options. Should I choose a toothpaste based on fluoride content? Do I need a whitener in my toothpaste? Is Crest different than Colgate? The end result is that the banal selection becomes cognitively demanding, as I have to assess dozens of alternatives and take an array of variables into account. And it’s not just toothpaste: The same thing has happened to nearly every consumption decision, from bottled water to blue jeans to stocks. There are no simple choices left – capitalism makes everything complicated.

    How should we make all these hard choices? How does one navigate a world of seemingly infinite alternatives? For thousands of years, the answer has seemed obvious: when faced with a difficult dilemma, we should carefully assess our options and spend a few moments consciously deliberating the information. Then, we should choose the toothpaste that best fits our preferences. This is how we maximize utility and get the most bang for the buck. We are rational agents – we should make decisions in a rational manner.

    But what if rationality backfires? What if we make better decisions when we trust our gut instincts? While there is an extensive literature on the potential wisdom of human emotion, it’s only in the last few years that researchers have demonstrated that the emotional system (aka Type 1 thinking) might excel at complex decisions, or those involving lots of variables. If true, this would suggest that the unconscious is better suited for difficult cognitive tasks than the conscious brain, that the very thought process we’ve long disregarded as irrational and impulsive might actually be “smarter” than reasoned deliberation. This is largely because the unconscious is able to handle a surfeit of information, digesting the facts without getting overwhelmed. (Human reason, in contrast, has a very strict bottleneck and can only process about four bits of data at any given moment.) When confused in the toothpaste aisle, bewildered by all the different options, we should go with the product that feels the best.

    The most widely cited demonstration of this theory is a 2006 Science paper led by Ap Dijksterhuis. (I wrote about the research in How We Decide.) The experiment went like this: Dijksterhuis got together a group of Dutch car shoppers and gave them descriptions of four different used cars. Each of the cars was rated in four different categories, for a total of sixteen pieces of information. Car number 1, for example, was described as getting good mileage, but had a shoddy transmission and poor sound system. Car number 2 handled poorly, but had lots of legroom. Dijksterhuis designed the experiment so that one car was objectively ideal, with “predominantly positive aspects”. After showing people these car ratings, Dijksterhuis then gave them a few minutes to consciously contemplate their decision. In this “easy” situation, more than fifty percent of the subjects ended up choosing the best car.
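
    As a concrete illustration of what “objectively ideal” means here, the toy Python sketch below (with invented attribute ratings, not Dijksterhuis’s actual materials) tallies each car’s positive and negative aspects; the ideal car is simply the one with the most positives, which is exactly the bookkeeping conscious deliberation is supposed to handle.

        # Toy reconstruction of the setup: four cars, four attributes each, rated
        # positive (+1) or negative (-1).  The attribute values are invented.
        cars = {
            "Car 1": {"mileage": +1, "transmission": -1, "sound system": -1, "handling": +1},
            "Car 2": {"mileage": -1, "transmission": +1, "sound system": -1, "legroom": +1},
            "Car 3": {"mileage": +1, "transmission": +1, "sound system": +1, "legroom": +1},
            "Car 4": {"mileage": -1, "transmission": -1, "sound system": +1, "handling": -1},
        }

        # Net score per car: count of positive aspects minus negative aspects.
        scores = {name: sum(ratings.values()) for name, ratings in cars.items()}

        for name, score in sorted(scores.items(), key=lambda kv: -kv[1]):
            print(f"{name}: net score {score:+d}")
        print("Objectively ideal choice:", max(scores, key=scores.get))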

    [div class=attrib]Read more of the article and Ap Dijksterhuis’ classic experiment here.[end-div]

    [div class=attrib]Image courtesy of CustomerSpeak.[end-div]

    Movies in the Mind: A Great Leap in Brain Imaging

    A common premise of “mad scientist” science fiction movies: a computer reconstructs video images from someone’s thoughts via a brain-scanning device. Yet this is no longer the realm of fantasy. Researchers from the University of California at Berkeley have successfully decoded and reconstructed people’s dynamic visual experiences – in this case, watching Hollywood movie trailers – using functional Magnetic Resonance Imaging (fMRI) and computer simulation models.

    Watch the stunning video clip below showing side-by-side movies of what a volunteer was actually watching and a computer reconstruction of fMRI data from the same volunteer.

    [youtube]nsjDnYxJ0bo[/youtube]

    The results are a rudimentary first step, and the technology will require decades of refinement before the fiction of movies such as Brainstorm becomes reality. Nonetheless, this groundbreaking research paves the way to a future of tremendous promise in brain science. Imagine the ability to reproduce and share images of our dreams and memories, or to peer into the brain of a comatose patient.
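
    For the technically curious, here is a heavily simplified sketch of the general “encoding model plus identification” idea behind this kind of decoding, run on synthetic numbers. It is not the Berkeley team’s actual pipeline, and every quantity in it is illustrative: a linear model is fitted to predict each voxel’s response from stimulus features, and a new clip is identified by finding the candidate whose predicted brain response best matches the one actually measured.

        # Simplified, synthetic-data sketch of "fit an encoding model, then identify
        # the stimulus whose predicted response best matches the measured response".
        import numpy as np

        rng = np.random.default_rng(0)
        n_train, n_features, n_voxels = 200, 50, 300

        # Stand-ins for (movie features, recorded fMRI responses) during training.
        X_train = rng.normal(size=(n_train, n_features))
        W_true = rng.normal(size=(n_features, n_voxels))
        Y_train = X_train @ W_true + 0.5 * rng.normal(size=(n_train, n_voxels))

        # Ridge-regression encoding model mapping features to voxel responses.
        lam = 1.0
        W_hat = np.linalg.solve(X_train.T @ X_train + lam * np.eye(n_features),
                                X_train.T @ Y_train)

        # Identification: which of 20 candidate clips produced a new response?
        candidates = rng.normal(size=(20, n_features))
        true_index = 7
        observed = candidates[true_index] @ W_true + 0.5 * rng.normal(size=n_voxels)

        predicted = candidates @ W_hat                 # predicted response per clip
        corr = [np.corrcoef(p, observed)[0, 1] for p in predicted]
        print(f"best-matching clip: {int(np.argmax(corr))} (true clip: {true_index})")

    The published reconstruction work is far more sophisticated, but matching predicted to measured responses is the conceptual core of this family of methods.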

    [div class=attrib]More from the UC-Berkeley article here.[end-div]

    How Will You Die?

    Bad news and good news. First, the bad news. If you’re a male between 45 and 54 years of age, your cause of death will most likely be heart disease. If you’re a female, on the other hand, you’re more likely to fall prey to cancer. And, interestingly, you are about five times more likely to die falling down stairs than from (accidental) electrocution. Now the good news. While the data may give us a probabilistic notion of how we may perish, no one (yet) knows when.

    More vital statistics are available courtesy of this macabre infographic, derived from data from the National Center for Health Statistics and the National Safety Council.

    Chance as a Subjective or Objective Measure

    [div class=attrib]From Rationally Speaking:[end-div]

    Stop me if you’ve heard this before: suppose I flip a coin, right now. I am not giving you any other information. What odds (or probability, if you prefer) do you assign that it will come up heads?

    If you would happily say “Even” or “1 to 1” or “Fifty-fifty” or “probability 50%” — and you’re clear on WHY you would say this — then this post is not aimed at you, although it may pleasantly confirm your preexisting opinions as a Bayesian on probability. Bayesians, broadly, consider probability to be a measure of their state of knowledge about some proposition, so that different people with different knowledge may correctly quote different probabilities for the same proposition.

    If you would say something along the lines of “The question is meaningless; probability only has meaning as the many-trials limit of frequency in a random experiment,” or perhaps “50%, but only given that a fair coin and fair flipping procedure is being used,” this post is aimed at you. I intend to try to talk you out of your Frequentist view; the view that probability exists out there and is an objective property of certain physical systems, which we humans, merely fallibly, measure.

    My broader aim is therefore to argue that “chance” is always and everywhere subjective — a result of the limitations of minds — rather than objective in the sense of actually existing in the outside world.
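
    By way of a small illustration of that Bayesian reading (this sketch is ours, not the original post’s), here are two observers quoting different, equally defensible probabilities for the very same physical coin flip, simply because they hold different information.

        # Probability as a state of knowledge: the same coin, two observers,
        # two different (and both reasonable) probability assignments.
        import random

        random.seed(3)

        true_bias = 0.7   # a property of the coin; unknown to both observers
        flips = [random.random() < true_bias for _ in range(12)]
        heads, tails = sum(flips), len(flips) - sum(flips)

        # Observer A knows only that "a coin will be flipped" and invokes symmetry.
        p_heads_A = 0.5

        # Observer B has watched the 12 flips and updates a uniform Beta(1, 1) prior;
        # the posterior predictive probability of heads is (1 + heads) / (2 + number of flips).
        p_heads_B = (1 + heads) / (2 + heads + tails)

        print(f"Observer A (no data): P(heads) = {p_heads_A:.2f}")
        print(f"Observer B ({heads} heads, {tails} tails seen): P(heads) = {p_heads_B:.2f}")

    Neither observer is “wrong”: each assigns the probability warranted by what they know, which is the sense in which chance lives in the mind rather than in the coin.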

    [div class=attrib]Much more of this article here.[end-div]

    [div class=attrib]Image courtesy of Wikipedia.[end-div]

    Eurovision

    If you grew up in Europe or have spent at least six months there over the last 50 years, you’ll have collided with the Eurovision Song Contest.

    A quintessentially European invention, Eurovision, as it is commonly known, has grown from a handful of countries to embrace 43 nations across Europe in 2012. Countries compete for the prize of best song and the honor of hosting the contest the following year. While contestants and songs are not usually guaranteed long-standing commercial success, the winner usually does claim 15 minutes or so in the spotlight and at least a one-hit wonder. A notable exception was the Swedish group ABBA, which went on to generation-spanning superstardom.

    Frank Jacobs over at Strange Maps offers his cartographic take on Eurovision.

    [div class=attrib]From Strange Maps / Big Think:[end-div]

    The Eurovision Song Contest is a resounding success in at least one respect. Set up as a laboratory of European harmony – musically, audiovisually and politically – its first edition [1] featured a mere 7 participating countries, all Western European. The 57th edition, next May in Azerbaijan, will have 43 countries from all over the continent vying for the top prize, and the honour to host the 2013 edition of the event in their capital city.

    Mission accomplished, then. But a chorus of critics – swelling, as the turn of phrase suggests [2] – finds the annual event increasingly tacky and irrelevant. The winner is determined by a tally of national votes, which have less to do with the quality of the songs than with the degree of friendliness between the participating countries.

    [div class=attrib]More of the article here.[end-div]