Tag Archives: decision making

Monsters of Our Own Making

For parents: a few brief tips on how to deal with young adult children — that most pampered of generations. Tip number 1: turn off junior’s access to the family Netflix account.

From WSJ:

Congratulations. Two months ago, your kid graduated from college, bravely finishing his degree rather than dropping out to make millions on his idea for a dating app for people who throw up during CrossFit training. If he’s like a great many of his peers, he’s moved back home, where he’s figuring out how to become an adult in the same room that still has his orthodontic headgear strapped to an Iron Man helmet.

Now we’re deep into summer, and the logistical challenges of your grad really being home are sinking in. You’re constantly juggling cars, cleaning more dishes and dealing with your daughter’s boyfriend, who not only slept over but also drank your last can of Pure Protein Frosty Chocolate shake.

But the real challenge here is a problem of your own making. You see, these children are members of the Most-Loved Generation: They’ve grown up with their lives stage-managed by us, their college-acceptance-obsessed parents. Remember when Eva, at age 7, was obsessed with gymnastics…for exactly 10 months, which is why the TV in your guest room sits on top of a $2,500 pommel horse?

Now that they’re out of college, you realize what wasn’t included in that $240,000 education: classes in life skills and decision-making.

With your kid at home, you find that he’s incapable of making a single choice on his own. Like when you’re working and he interrupts to ask how many blades is the best number for a multi-blade razor. Or when you’ve just crawled into bed and hear the familiar refrain of, “Mom, what can we eat?” All those years being your kid’s concierge and coach have created a monster.

So the time has come for you to cut the cord. And by that I mean: Take your kid off your Netflix account. He will be confused and upset at first, not understanding why this is happening to him, but it’s a great opportunity for him to sign up for something all by himself.

Which brings us to money. It’s finally time to channel your Angela Merkel and get tough with your young Alexis Tsipras. Put him on a consistent allowance and make him pay the extra fees incurred when he uses the ATM at the weird little deli rather than the one at his bank, a half-block away.

Next, nudge your kid to read books about self-motivation. Begin with baby steps: Don’t just hand her “Lean In” and “I Am Malala.” Your daughter’s great, but she’s no Malala. And the only thing she’s leaning in to is a bag of kettle corn while binge-watching “Orange Is the New Black.”

Instead, over dinner, casually drop a few pearls of wisdom from “Coach Wooden’s Pyramid of Success,” such as, “Make each day your masterpiece.” Let your kid decide whether getting a high score on her “Panda Pop Bubble Shooter” iPhone game qualifies. Then hope that John Wooden has piqued her curiosity and leave his book out with a packet of Sour Patch Xploderz on top. With luck, she’ll take the bait (candy and book).

Now it’s time to work on your kid’s inability to make a decision, which, let’s be honest, you’ve instilled over the years by jumping to answer all of her texts, even that time you were at the opera. “But,” you object, “it could have been an emergency!” It wasn’t. She couldn’t remember whether she liked Dijon mustard or mayo on her turkey wrap.

Set up some outings that nurture independence. Send your kid to the grocery store with orders to buy a week of dinner supplies. She’ll ask a hundred questions about what to get, but just respond with, “Whatever looks good to you” or, “Have fun with it.” She will look at you with panic, but don’t lose your resolve. Send her out and turn your phone off to avoid a barrage of texts, such as, “They’re out of antibacterial wipes to clean off the shopping cart handle. What should I do?”

Rest assured, in a couple of hours, she’ll return with “dinner”—frozen waffles and a bag of Skinny Pop popcorn. Tough it out and serve it for dinner: The name of the game is positive reinforcement.

Once she’s back you’ll inevitably get hit with more questions, like, “It’s not lost, but how expensive is that remote key for the car?” Take a deep breath and just say, “Um, I’m not sure. Why don’t you Google it?”

Read the entire story here.

We Are All Always Right, All of the Time

You already know this: you believe that your opinion is correct all the time, about everything. And, interestingly enough, your friends and neighbors believe that they are always right too. Oh, and the colleague at the office with whom you argue all the time — she’s right all the time too.

How can this be when, in an increasingly science-driven, objective universe, facts trump opinion? Well, not so fast. It seems that we humans have an internal mechanism that colors our views based on a need for acceptance within a broader group. That is, we tend to bend our rational views toward the group consensus rather than side with a subject matter expert whose view might polarize the group. This is both good and bad. Good because it reinforces the broader benefits of belonging to a group; bad because we are more likely to reject opinion, evidence and fact from experts outside of our group — think climate change.

From the Washington Post:

It’s both the coolest — and also in some ways the most depressing — psychology study ever.

Indeed, it’s so cool (and so depressing) that the name of its chief finding — the Dunning-Kruger effect — has at least halfway filtered into public consciousness. In the classic 1999 paper, Cornell researchers David Dunning and Justin Kruger found that the less competent people were in three domains — humor, logic, and grammar — the less likely they were to be able to recognize that. Or as the researchers put it:

We propose that those with limited knowledge in a domain suffer from a dual burden: Not only do they reach mistaken conclusions and make regrettable errors, but their incompetence robs them of the ability to realize it.

Dunning and Kruger didn’t directly apply this insight to our debates about science. But I would argue that the effect named after them certainly helps to explain phenomena like vaccine denial, in which medical authorities have voiced a very strong opinion, but some parents just keep on thinking that, somehow, they’re in a position to challenge or ignore this view.

So why do I bring this classic study up now?

The reason is that an important successor to the Dunning-Kruger paper has just come out — and it, too, is pretty depressing (at least for those of us who believe that domain expertise is a thing to be respected and, indeed, treasured). This time around, psychologists have not uncovered an endless spiral of incompetence and the inability to perceive it. Rather, they’ve shown that people have an “equality bias” when it comes to competence or expertise, such that even when it’s very clear that one person in a group is more skilled, expert, or competent (and the other less so), they are nonetheless inclined to seek out a middle ground in determining how correct different viewpoints are.

Yes, that’s right — we’re all right, nobody’s wrong, and nobody gets hurt feelings.

The new study, just published in the Proceedings of the National Academy of Sciences, is by Ali Mahmoodi of the University of Tehran and a long list of colleagues from universities in the UK, Germany, China, Denmark, and the United States. And no wonder: The research was transnational, and the same experiment — with the same basic results — was carried out across cultures in China, Denmark, and Iran.

Read the entire story here.

Business Decision-Making Welcomes Science

It is likely that business will never eliminate gut instinct from the decision-making process. However, as data, now big data, increasingly pervades every crevice of every organization, data-driven decision-making will become the norm. As this happens, more and more businesses find themselves employing data scientists to help filter, categorize, mine and analyze these mountains of data in meaningful ways.

The caveat, of course, is that data, big data and an even bigger reliance on that data require subject matter expertise and analysts with critical thinking skills and sound judgment — data cannot be used blindly.

From Technology Review:

Throughout history, innovations in instrumentation—the microscope, the telescope, and the cyclotron—have repeatedly revolutionized science by improving scientists’ ability to measure the natural world. Now, with human behavior increasingly reliant on digital platforms like the Web and mobile apps, technology is effectively “instrumenting” the social world as well. The resulting deluge of data has revolutionary implications not only for social science but also for business decision making.

As enthusiasm for “big data” grows, skeptics warn that overreliance on data has pitfalls. Data may be biased and is almost always incomplete. It can lead decision makers to ignore information that is harder to obtain, or make them feel more certain than they should. The risk is that in managing what we have measured, we miss what really matters—as Vietnam-era Secretary of Defense Robert McNamara did in relying too much on his infamous body count, and as bankers did prior to the 2007–2009 financial crisis in relying too much on flawed quantitative models.

The skeptics are right that uncritical reliance on data alone can be problematic. But so is overreliance on intuition or ideology. For every Robert McNamara, there is a Ron Johnson, the CEO whose disastrous tenure as the head of JC Penney was characterized by his dismissing data and evidence in favor of instincts. For every flawed statistical model, there is a flawed ideology whose inflexibility leads to disastrous results.

So if data is unreliable and so is intuition, what is a responsible decision maker supposed to do? While there is no correct answer to this question—the world is too complicated for any one recipe to apply—I believe that leaders across a wide range of contexts could benefit from a scientific mind-set toward decision making.

A scientific mind-set takes as its inspiration the scientific method, which at its core is a recipe for learning about the world in a systematic, replicable way: start with some general question based on your experience; form a hypothesis that would resolve the puzzle and that also generates a testable prediction; gather data to test your prediction; and finally, evaluate your hypothesis relative to competing hypotheses.
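
To make that recipe concrete, here is a minimal sketch of one pass through the loop, written as a toy A/B test in Python. It is this blog’s own illustration rather than anything from the article, and the conversion counts are invented.

```python
# One pass through the scientific method as a toy A/B test.
# All counts are invented for illustration.
from math import sqrt, erf

# 1. Question: does the new checkout page convert better than the old one?
# 2. Hypothesis: the new page has a higher conversion rate.
#    Testable prediction: visitors shown the new page buy more often.

# 3. Gather data (hypothetical): conversions out of visitors for each page.
old_conv, old_n = 120, 3000    # 4.0% conversion on the old page
new_conv, new_n = 165, 3000    # 5.5% conversion on the new page

# 4. Evaluate the hypothesis against the competing one ("no real difference")
#    using a two-proportion z-test.
p_old, p_new = old_conv / old_n, new_conv / new_n
p_pool = (old_conv + new_conv) / (old_n + new_n)
se = sqrt(p_pool * (1 - p_pool) * (1 / old_n + 1 / new_n))
z = (p_new - p_old) / se
p_value = 1 - 0.5 * (1 + erf(z / sqrt(2)))   # one-sided: is "new" better?

print(f"old: {p_old:.1%}  new: {p_new:.1%}  z = {z:.2f}  p = {p_value:.4f}")
# A small p-value supports the new-page hypothesis; a large one means the
# data cannot distinguish it from "no difference", and the testing continues.
```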

The scientific method is largely responsible for the astonishing increase in our understanding of the natural world over the past few centuries. Yet it has been slow to enter the worlds of politics, business, policy, and marketing, where our prodigious intuition for human behavior can always generate explanations for why people do what they do or how to make them do something different. Because these explanations are so plausible, our natural tendency is to want to act on them without further ado. But if we have learned one thing from science, it is that the most plausible explanation is not necessarily correct. Adopting a scientific approach to decision making requires us to test our hypotheses with data.

While data is essential for scientific decision making, theory, intuition, and imagination remain important as well—to generate hypotheses in the first place, to devise creative tests of the hypotheses that we have, and to interpret the data that we collect. Data and theory, in other words, are the yin and yang of the scientific method—theory frames the right questions, while data answers the questions that have been asked. Emphasizing either at the expense of the other can lead to serious mistakes.

Also important is experimentation, which doesn’t mean “trying new things” or “being creative” but quite specifically the use of controlled experiments to tease out causal effects. In business, most of what we observe is correlation—we do X and Y happens—but often what we want to know is whether or not X caused Y. How many additional units of your new product did your advertising campaign cause consumers to buy? Will expanded health insurance coverage cause medical costs to increase or decline? Simply observing the outcome of a particular choice does not answer causal questions like these: we need to observe the difference between choices.
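
To see why that distinction matters, here is a small simulation, again this blog’s own sketch with invented numbers rather than the author’s. An ad is given a known effect on spending; a naive comparison of customers who happened to see the ad overstates that effect, while randomly assigning exposure recovers it.

```python
# Sketch: why randomized assignment matters when asking whether X (an ad)
# caused Y (spending). All numbers are invented for illustration.
import random
random.seed(0)

TRUE_EFFECT = 5.0   # in this toy world, the ad truly adds $5 of spending

def simulate(assign_ad):
    """Build a toy dataset where assign_ad decides who sees the ad, then
    return the difference in mean spending: exposed minus unexposed."""
    rows = []
    for _ in range(100_000):
        enthusiasm = random.random()        # unobserved trait in [0, 1)
        saw_ad = assign_ad(enthusiasm)
        spend = 20 + 10 * enthusiasm + TRUE_EFFECT * saw_ad + random.gauss(0, 2)
        rows.append((saw_ad, spend))
    treated = [s for saw, s in rows if saw]
    control = [s for saw, s in rows if not saw]
    return sum(treated) / len(treated) - sum(control) / len(control)

# Observational data: enthusiastic customers seek out the ad themselves,
# so exposure is confounded with the trait that also drives spending.
naive = simulate(lambda e: e > 0.5)

# Controlled experiment: exposure is a coin flip, independent of any trait.
randomized = simulate(lambda e: random.random() < 0.5)

print(f"true causal effect of the ad:   {TRUE_EFFECT:.2f}")
print(f"naive observational estimate:   {naive:.2f}")        # roughly double the truth
print(f"randomized-experiment estimate: {randomized:.2f}")    # close to the truth
```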

Replicating the conditions of a controlled experiment is often difficult or impossible in business or policy settings, but increasingly it is being done in “field experiments,” where treatments are randomly assigned to different individuals or communities. For example, MIT’s Poverty Action Lab has conducted over 400 field experiments to better understand aid delivery, while economists have used such experiments to measure the impact of online advertising.

Although field experiments are not an invention of the Internet era—randomized trials have been the gold standard of medical research for decades—digital technology has made them far easier to implement. Thus, as companies like Facebook, Google, Microsoft, and Amazon increasingly reap performance benefits from data science and experimentation, scientific decision making will become more pervasive.

Nevertheless, there are limits to how scientific decision makers can be. Unlike scientists, who have the luxury of withholding judgment until sufficient evidence has accumulated, policy makers or business leaders generally have to act in a state of partial ignorance. Strategic calls have to be made, policies implemented, reward or blame assigned. No matter how rigorously one tries to base one’s decisions on evidence, some guesswork will be required.

Exacerbating this problem is that many of the most consequential decisions offer only one opportunity to succeed. One cannot go to war with half of Iraq and not the other just to see which policy works out better. Likewise, one cannot reorganize the company in several different ways and then choose the best. The result is that we may never know which good plans failed and which bad plans worked.

Read the entire article here.

Image: Screenshot of Iris, Ayasdi’s data-visualization tool. Courtesy of Ayasdi / Wired.

Wrong Decisions, Bad Statistics

Each of us makes countless decisions daily. A not insignificant number of these, each day, are probably wrong. In most cases we continue, recover, readjust, move on, and sometimes even correct ourselves and learn. In the majority of instances these wrong decisions lead to inconsequential results.

However, sometimes the results are much more tragic, leading to accidents, injury and death. When those incorrect decisions are made by healthcare professionals, the consequences are especially stark. By some estimates, around 50,000 hospital deaths each year in Canada and the U.S. could be prevented if the real cause of illness were correctly identified.

From the New York Times:

Six years ago I was struck down with a mystery illness. My weight dropped by 30 pounds in three months. I experienced searing stomach pain, felt utterly exhausted and no matter how much I ate, I couldn’t gain an ounce.

I went from slim to thin to emaciated. The pain got worse, a white heat in my belly that made me double up unexpectedly in public and in private. Delivering on my academic and professional commitments became increasingly challenging.

It was terrifying. I did not know whether I had an illness that would kill me or stay with me for the rest of my life or whether what was wrong with me was something that could be cured if I could just find out what on earth it was.

Trying to find the answer, I saw doctors in London, New York, Minnesota and Chicago.

I was offered a vast range of potential diagnoses. Cancer was quickly and thankfully ruled out. But many other possibilities remained on the table, from autoimmune diseases to rare viruses to spinal conditions to debilitating neural illnesses.

Treatments suggested ranged from a five-hour, high-risk surgery to remove a portion of my stomach, to lumbar spine injections to numb nerve paths, to a prescription of antidepressants.

Faced with all these confusing and conflicting opinions, I had to work out which expert to trust, whom to believe and whose advice to follow. As an economist specializing in the global economy, international trade and debt, I have spent most of my career helping others make big decisions — prime ministers, presidents and chief executives — and so I’m all too aware of the risks and dangers of poor choices in the public as well as the private sphere. But up until then I hadn’t thought much about the process of decision making. So in between M.R.I.’s, CT scans and spinal taps, I dove into the academic literature on decision making. Not just in my field but also in neuroscience, psychology, sociology, information science, political science and history.

What did I learn?

Physicians do get things wrong, remarkably often. Studies have shown that up to one in five patients are misdiagnosed. In the United States and Canada it is estimated that 50,000 hospital deaths each year could have been prevented if the real cause of illness had been correctly identified.

Yet people are loath to challenge experts. In a 2009 experiment carried out at Emory University, a group of adults was asked to make a decision while contemplating an expert’s claims, in this case, a financial expert. A functional M.R.I. scanner gauged their brain activity as they did so. The results were extraordinary: when confronted with the expert, it was as if the independent decision-making parts of many subjects’ brains pretty much switched off. They simply ceded their power to decide to the expert.

If we are to control our own destinies, we have to switch our brains back on and come to our medical consultations with plenty of research done, able to use the relevant jargon. If we can’t do this ourselves we need to identify someone in our social or family network who can do so on our behalf.

Anxiety, stress and fear — emotions that are part and parcel of serious illness — can distort our choices. Stress makes us prone to tunnel vision, less likely to take in the information we need. Anxiety makes us more risk-averse than we would be regularly and more deferential.

We need to know how we are feeling. Mindfully acknowledging our feelings serves as an “emotional thermostat” that recalibrates our decision making. It’s not that we can’t be anxious, it’s that we need to acknowledge to ourselves that we are.

It is also crucial to ask probing questions not only of the experts but of ourselves. This is because we bring into our decision-making process flaws and errors of our own. All of us show bias when it comes to what information we take in. We typically focus on anything that agrees with the outcome we want.

Read the entire article here.

Procrastination is a Good Thing

Procrastinators have known this for a long time: that success comes from making a decision at the last possible moment.

Procrastinating professor Frank Partnoy expands on this theory, captured in his book, “Wait: The Art and Science of Delay.”

From Smithsonian:

Sometimes life seems to happen at warp speed. But, decisions, says Frank Partnoy, should not. When the financial market crashed in 2008, the former investment banker and corporate lawyer, now a professor of finance and law and co-director of the Center for Corporate and Securities Law at the University of San Diego, turned his attention to literature on decision-making.

“Much recent research about decisions helps us understand what we should do or how we should do it, but it says little about when,” he says.

In his new book, Wait: The Art and Science of Delay, Partnoy claims that when faced with a decision, we should assess how long we have to make it, and then wait until the last possible moment to do so. Should we take his advice on how to “manage delay,” we will live happier lives.

It is not surprising that the author of a book titled Wait is a self-described procrastinator. In what ways do you procrastinate?

I procrastinate in just about every possible way and always have, since my earliest memories going back to when I first started going to elementary school and had these arguments with my mother about making my bed.

My mom would ask me to make my bed before going to school. I would say, no, because I didn’t see the point of making my bed if I was just going to sleep in it again that night. She would say, well, we have guests coming over at 6 o’clock, and they might come upstairs and look at your room. I said, I would make my bed when we know they are here. I want to see a car in the driveway. I want to hear a knock on the door. I know it will take me about one minute to make my bed so at 5:59, if they are here, I will make my bed.

I procrastinated all through college and law school. When I went to work at Morgan Stanley, I was delighted to find that although the pace of the trading floor is frenetic and people are very fast, there were lots of incredibly successful mentors of procrastination.

Now, I am an academic. As an academic, procrastination is practically a job requirement. If I were to say I would be submitting an academic paper by September 1, and I submitted it in August, people would question my character.

It has certainly been drilled into us that procrastination is a bad thing. Yet, you argue that we should embrace it. Why?

Historically, for human beings, procrastination has not been regarded as a bad thing. The Greeks and Romans generally regarded procrastination very highly. The wisest leaders embraced procrastination and would basically sit around and think and not do anything unless they absolutely had to.

The idea that procrastination is bad really started in the Puritanical era with Jonathan Edwards’s sermon against procrastination and then the American embrace of “a stitch in time saves nine,” and this sort of work ethic that required immediate and diligent action.

But if you look at recent studies, managing delay is an important tool for human beings. People are more successful and happier when they manage delay. Procrastination is just a universal state of being for humans. We will always have more things to do than we can possibly do, so we will always be imposing some sort of unwarranted delay on some tasks. The question is not whether we are procrastinating, it is whether we are procrastinating well.

Read the entire article here.

Image courtesy of eHow.

Complex Decision To Make? Go With the Gut

Over the last couple of years a number of researchers have upended conventional wisdom by finding that complex decisions, such as those with many variables, are better “made” through our emotional system. This flies in the face of the commonly held belief that complexity is best handled by our rational side.

Jonah Lehrer over at the Frontal Cortex brings us up to date on current thinking.

We live in a world filled with difficult decisions. In fact, we’ve managed to turn even trivial choices – say, picking a toothpaste – into a tortured mental task, as the typical supermarket has more than 200 different dental cleaning options. Should I choose a toothpaste based on fluoride content? Do I need a whitener in my toothpaste? Is Crest different than Colgate? The end result is that the banal selection becomes cognitively demanding, as I have to assess dozens of alternatives and take an array of variables into account. And it’s not just toothpaste: The same thing has happened to nearly every consumption decision, from bottled water to blue jeans to stocks. There are no simple choices left – capitalism makes everything complicated.

How should we make all these hard choices? How does one navigate a world of seemingly infinite alternatives? For thousands of years, the answer has seemed obvious: when faced with a difficult dilemma, we should carefully assess our options and spend a few moments consciously deliberating the information. Then, we should choose the toothpaste that best fits our preferences. This is how we maximize utility and get the most bang for the buck. We are rational agents – we should make decisions in a rational manner.

But what if rationality backfires? What if we make better decisions when we trust our gut instincts? While there is an extensive literature on the potential wisdom of human emotion, it’s only in the last few years that researchers have demonstrated that the emotional system (aka Type 1 thinking) might excel at complex decisions, or those involving lots of variables. If true, this would suggest that the unconscious is better suited for difficult cognitive tasks than the conscious brain, that the very thought process we’ve long disregarded as irrational and impulsive might actually be “smarter” than reasoned deliberation. This is largely because the unconscious is able to handle a surfeit of information, digesting the facts without getting overwhelmed. (Human reason, in contrast, has a very strict bottleneck and can only process about four bits of data at any given moment.) When confused in the toothpaste aisle, bewildered by all the different options, we should go with the product that feels the best.

The most widely cited demonstration of this theory is a 2006 Science paper led by Ap Dijksterhuis. (I wrote about the research in How We Decide.) The experiment went like this: Dijksterhuis got together a group of Dutch car shoppers and gave them descriptions of four different used cars. Each of the cars was rated in four different categories, for a total of sixteen pieces of information. Car number 1, for example, was described as getting good mileage, but had a shoddy transmission and poor sound system. Car number 2 handled poorly, but had lots of legroom. Dijksterhuis designed the experiment so that one car was objectively ideal, with “predominantly positive aspects”. After showing people these car ratings, Dijksterhuis then gave them a few minutes to consciously contemplate their decision. In this “easy” situation, more than fifty percent of the subjects ended up choosing the best car.
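
For a sense of what “objectively ideal” means in this kind of setup, here is a tiny tally of positive aspects, written as another sketch from this blog; the ratings are invented stand-ins, not the ones Dijksterhuis actually used.

```python
# The "easy" condition described above: four cars, four rated aspects each
# (sixteen pieces of information), with one car designed to have
# predominantly positive aspects. These ratings are invented placeholders.
cars = {
    "Car 1": {"mileage": True,  "transmission": False, "sound system": False, "legroom": True},
    "Car 2": {"mileage": False, "transmission": True,  "sound system": True,  "legroom": False},
    "Car 3": {"mileage": False, "transmission": False, "sound system": True,  "legroom": False},
    "Car 4": {"mileage": True,  "transmission": True,  "sound system": True,  "legroom": False},
}

# The objectively ideal choice is simply the car with the most positives.
scores = {name: sum(aspects.values()) for name, aspects in cars.items()}
best = max(scores, key=scores.get)
print(scores)                      # {'Car 1': 2, 'Car 2': 2, 'Car 3': 1, 'Car 4': 3}
print("objectively best:", best)   # Car 4, with predominantly positive aspects
```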

Read more of the article and Ap Dijksterhuis’ classic experiment here.

Image courtesy of CustomerSpeak.