Brilliant book with one clear message: our emotional brain is faster and usually smarter than our logical brain. Our emotions are trained by years of logic and experience, retaining it all for real wisdom. Many decisions are better made by going with the gut feeling. Gets a little too technical with deep brain/neuro/cortex talk, but brings it back to usable points.
The orbitofrontal cortex (OFC) is responsible for integrating visceral emotions into the decision-making process. It connects the feelings generated by the "primitive" brain - areas like the brain stem and the amygdala, which is in the limbic system - to the stream of conscious thought.
When a person is drawn to something, the mind is trying to tell him that he should choose that option. It has already assessed the alternatives - this analysis takes place outside of conscious awareness - and converted that assessment into a positive emotion. And when he sees an option he doesn't like, it is the OFC that makes him want to get away.
The world is full of things, and it is our feelings that help us choose among them.
Emotion and motivation share the same Latin root, movere, which means "to move."
Feelings are often an accurate shortcut, a concise expression of decades' worth of experience. They already know how to do it.
The human brain is like a computer operating system that was rushed to market. This is why a cheap calculator can do arithmetic better than a professional mathematician, why a mainframe computer can beat a grand master at chess, and why we so often confuse causation and correlation. When it comes to the new parts of the brain, evolution just hasn't had time to work out the kinks. The emotional brain, however, has been exquisitely refined by evolution over the last several hundred million years. Its software code has been subjected to endless tests, so it can make fast decisions based on very little information.
The process of thinking requires feeling, for feelings are what let us understand all the information that we can't directly comprehend. Reason without emotion is impotent.
After the anterior cingulate cortex (ACC) receives input from a dopamine neuron, spindle cells use their cellular velocity - they transmit electrical signals faster than any other neuron - to make sure that the rest of the cortex is instantly saturated in that specific feeling.
Human emotions are rooted in the predictions of highly flexible brain cells, which are constantly adjusting their connections to reflect reality. Every time you make a mistake or encounter something new, your brain cells are busy changing themselves. Our emotions are deeply empirical.
Dopamine neurons automatically detect the subtle patterns that we would otherwise fail to notice; they assimilate all the data that we can't consciously comprehend. And then, once they come up with a set of refined predictions about how the world works, they translate these predictions into emotions.
These wise yet inexplicable feelings are an essential part of the decision-making process. Even when we think we know nothing, our brains know something. That's what our feelings are trying to tell us.
This doesn't mean that people can coast on these cellular emotions. Dopamine neurons need to be continually trained and retrained, or else their predictive accuracy declines. Trusting one's emotions requires constant vigilance; intelligent intuition is the result of deliberate practice.
Bill Robertie's success has a simple explanation: "I know how to practice. I know how to make myself better."
Robertie bought a book on backgammon strategy, memorized a few opening moves, and then started to play. And play. And play. "You've got to get obsessed. You've got to reach the point where you're having dreams about the game. I could just glance at a board and know what I should do. The game started to become very much a matter of aesthetics. My decisions increasingly depended on the look of things, so that I could contemplate a move and then see right away if it made my position look better or worse."
It's not the quantity of practice, it's the quality. The most effective way to get better is to focus on your mistakes. In other words, you need to consciously consider the errors being internalized.
Robertie is always searching for his errors, dissecting those decisions that could have been a little bit better. He knows that self-criticism is the secret to self-improvement; negative feedback is the best kind.
An expert is a person who has made all the mistakes that can be made in a very narrow field.
The experience of failure had been so discouraging for the "smart" kids that they actually regressed.
The problem with praising kids for their innate intelligence - the "smart" compliment - is that it misrepresents the neural reality of education. It encourages kids to avoid the most useful kind of learning activity, which is learning from mistakes.
Unless you experience the unpleasant symptoms of being wrong, your brain will never revise its models.
After Herb Stein finishes shooting a soap opera episode, he immediately goes home and reviews the rough cut. "I watch the whole thing, and I just take notes. I'm looking really hard for my mistakes. I pretty much always want to find thirty mistakes, thirty things that I could have done better. If I can't find thirty, then I'm not looking hard enough."
But emotions aren't perfect. They are a crucial cognitive tool, but even the most useful tools can't solve every problem. In fact, there are certain conditions that consistently short-circuit the emotional brain, causing people to make bad decisions. The best decision-makers know which situations require less intuitive responses.
Scientists found there was absolutely no evidence of the hot hand. A player's chance of making a shot was not affected by whether or not his previous shots had gone in. Each field-goal attempt was its own independent event.
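The statistical point can be sketched in a few lines of Python (a toy simulation, not the researchers' actual data): when attempts really are independent, the make rate immediately after a made shot is indistinguishable from the rate after a miss.

```python
import random

random.seed(42)

def simulate_shots(n, p=0.5):
    """n independent field-goal attempts, each made with probability p."""
    return [random.random() < p for _ in range(n)]

def make_rate_after(shots, previous_made):
    """Make rate on attempts immediately following a make (or a miss)."""
    following = [shots[i] for i in range(1, len(shots))
                 if shots[i - 1] == previous_made]
    return sum(following) / len(following)

shots = simulate_shots(100_000)
# Both conditional rates hover around 0.5: no hot hand in independent data,
# even though the sequence contains plenty of streaks.
print(round(make_rate_after(shots, True), 3),
      round(make_rate_after(shots, False), 3))
```

Long runs of makes still appear in the simulated data, which is exactly why our pattern-hungry brains see a hot hand where there is none.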
The stock market is a classic example of a random system. The past movement of any particular stock cannot be used to predict its future movement.
Fama looked at decades of stock-market data in order to prove that no amount of knowledge or rational analysis could help anyone figure out what would happen next. All of the esoteric tools used by investors to make sense of the market were pure nonsense. Wall Street was like a slot machine.

The danger of the stock market, however, is that sometimes its erratic fluctuations can actually look predictable, at least in the short term.
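A toy random-walk simulation (the drift and volatility numbers are purely hypothetical) illustrates Fama's point: even with an upward drift, yesterday's return carries essentially no information about today's.

```python
import random

random.seed(0)

def daily_returns(n, drift=0.0003, vol=0.01):
    """Hypothetical stock returns: a small upward drift plus pure noise."""
    return [drift + random.gauss(0, vol) for _ in range(n)]

def lag1_autocorr(xs):
    """Correlation between each day's return and the previous day's."""
    mean = sum(xs) / len(xs)
    num = sum((xs[i] - mean) * (xs[i - 1] - mean) for i in range(1, len(xs)))
    den = sum((x - mean) ** 2 for x in xs)
    return num / den

rets = daily_returns(50_000)
# Near zero: past movement doesn't predict future movement,
# even though short stretches of the walk can look like trends.
print(round(lag1_autocorr(rets), 3))
```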
Our brains completely misinterpret what's actually going on. We trust our feelings and perceive patterns, but the patterns don't actually exist.
These computational signals are also a main cause of financial bubbles. When the market keeps going up, people are led to make larger and larger investments in the boom. Their greedy brains are convinced that they've solved the stock market, and so they don't think about the possibility of losses.
You get the exact opposite effect when the market heads down. People just can't wait to get out, because the brain doesn't want to regret staying in. At this point, the brain realizes that it's made some very expensive prediction errors, and the investor races to dump any assets that are declining in value.
It's silly to try to beat the market with your brain.
Since the market is a random walk with an upward slope, the best solution is to pick a low-cost index fund and wait. Patiently. Don't fixate on what might have been or obsess over someone else's profits. The investor who does nothing to his stock portfolio - who doesn't buy or sell a single stock - outperforms the average "active" investor by nearly 10 percent. Wall Street has always searched for the secret algorithm of financial success, but the secret is, there is no secret.
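The cost of that underperformance compounds. A back-of-the-envelope calculation (return figures purely illustrative, not from the book) shows what a couple of points of annual drag from churning does over thirty years:

```python
def grow(principal, annual_return, years):
    """Compound a lump sum at a fixed annual return."""
    return principal * (1 + annual_return) ** years

# Illustrative numbers: a 7% index return vs. a 2-point drag
# from trading costs and mistimed moves.
passive = grow(10_000, 0.07, 30)
active = grow(10_000, 0.05, 30)
print(round(passive), round(active))
```

The patient portfolio ends up worth far more than the churned one, which is the whole argument for the low-cost index fund.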
Over the long term, stock portfolios always generated higher returns than bond portfolios. In fact, stocks typically earned more than seven times as much as bonds. MaCurdy and Shoven concluded that people who invest in bonds must be "confused about the relative safety of different investments over long horizons."
The key to solving the equity premium puzzle was loss aversion. Investors buy bonds because they hate losing money, and bonds are a safe bet. Instead of making financial decisions that reflect all the relevant statistical information, they depend on their emotional instincts and seek the certain safety of bonds.
Loss aversion also explains one of the most common investing mistakes: investors evaluating their stock portfolios are most likely to sell stocks that have increased in value. Unfortunately, this means that they end up holding on to their depreciating stocks. Over the long term, this strategy is exceedingly foolish.
Even professional money managers are vulnerable to this bias and tend to hold losing stocks twice as long as winning stocks. Why does an investor do this? Because he is afraid to take a loss - it feels bad - and selling shares that have decreased in value makes the loss tangible. We try to postpone the pain for as long as possible; the result is more losses.
Loss aversion is an innate flaw. Everyone who experiences emotion is vulnerable to its effects. It's part of a larger psychological phenomenon known as negativity bias, which means that, for the human mind, bad is stronger than good.
Asymmetric paternalism. That's a fancy name for a simple idea: creating policies and incentives that help people triumph over their irrational impulses and make better, more prudent decisions.
People who are more rational don't perceive emotion less; they just regulate it better.
How do we regulate our emotions? The answer is surprisingly simple: by thinking about them.
As soon as people have the insight, they say it just seems obviously correct. They know instantly that they've solved the problem.
When you encounter a problem you've never experienced before, when your dopamine neurons have no idea what to do, it's essential that you try to tune out your feelings. Pilots call such a state "deliberate calm," because staying calm in high-pressure situations requires conscious effort.
Performance choking is actually triggered by a specific mental mistake: thinking too much.
Novice putters hit better shots when they consciously reflect on their actions.
The more time the beginner spends thinking about the putt, the more likely he is to sink the ball in the hole. By concentrating on the golf game, by paying attention to the mechanics of the stroke, the novice can avoid beginners' mistakes. A little experience, however, changes everything. After a golfer has learned how to putt - once he or she has memorized the necessary movements - analyzing the stroke is a waste of time. The brain already knows what to do.
(After black students did worse on IQ tests:) When Steele gave a separate group of students the same test but stressed that it was not a measure of intelligence - he told them it was merely a preparatory drill - the scores of the white and black students were virtually identical. The achievement gap had been closed.
When the rational brain hijacks the mind, people tend to make all sorts of decision-making mistakes. They hit bad golf shots and choose wrong answers on standardized tests. They ignore the wisdom of their emotions - the knowledge embedded in their dopamine neurons - and start reaching for things that they can explain. One of the problems with feelings is that even when they are accurate, they can still be hard to articulate. Instead of going with the option that feels the best, a person starts going with the option that sounds the best, even if it's a very bad idea.
He asked them to explain why they preferred one brand over another. As they tasted the jams, the students filled out written questionnaires, which forced them to analyze their first impressions, to consciously explain their impulsive preferences. All this extra analysis seriously warped their jam judgment. The students now preferred the worst-tasting jam, according to Consumer Reports.
Thinking too much about strawberry jam causes us to focus on all sorts of variables that don't actually matter. Instead of just listening to our instinctive preferences - the best jam is associated with the most positive feelings - our rational brains search for reasons to prefer one jam over another.
There is such a thing as too much analysis. When you overthink at the wrong moment, you cut yourself off from the wisdom of your emotions, which are much better at assessing actual preferences. You lose the ability to know what you really want.
The more people thought about which posters they wanted, the more misleading their thoughts became. Self-analysis resulted in less self-awareness.
Deliberative homeowners focused on less important details like square footage and number of bathrooms. It's easier to consider quantifiable facts than future emotions, such as how you'll feel when you're stuck in a rush-hour traffic jam. The prospective homeowners assumed a bigger house in the suburbs would make them happy, even if it meant spending an extra hour in the car every day.
The placebo effect depended entirely on the prefrontal cortex, the center of reflective, deliberate thought. When people were told that they'd just received pain-relieving cream, their frontal lobes responded by inhibiting the activity of their emotional brain areas (like the insula) that normally respond to pain. Because people expected to experience less pain, they ended up experiencing less pain.
People who'd paid discounted prices for "brain power" drinks consistently solved about 30 percent fewer puzzles than the people who'd paid full price for the drinks. The subjects were convinced that the stuff on sale was much less potent, even though all the drinks were identical.
The effort required to memorize seven digits drew cognitive resources away from the part of the brain that normally controls emotional urges.
A slight drop in blood-sugar levels can also inhibit self-control, since the frontal lobes require lots of energy in order to function.
Students who were given the drink without real sugar were significantly more likely to rely on instinct and intuition when choosing a place to live, even if that led them to choose the wrong places. The reason, according to Baumeister, is that the rational brains of these students were simply too exhausted to think. They'd needed a restorative sugar fix, and all they'd gotten was Splenda. This research can also help explain why we get cranky when we're hungry and tired: the brain is less able to suppress the negative emotions sparked by small annoyances.
Andreassen had each student select a portfolio of stock investments. Then he divided the students into two groups. The first group could see only the changes in the prices of their stocks. They had no idea why the share prices rose or fell and had to make their trading decisions based on an extremely limited amount of data. In contrast, the second group was given access to a steady stream of financial information. They could watch CNBC, read the Wall Street Journal, and consult experts for the latest analysis of market trends. So which group did better? To Andreassen's surprise, the group with less information ended up earning more than twice as much as the well-informed group. Being exposed to extra news was distracting, and the high-information students quickly became focused on the latest rumors and insider gossip. (Herbert Simon said it best: "A wealth of information creates a poverty of attention.") As a result of all the extra input, these students engaged in far more buying and selling than the low-information group. They were convinced that all their knowledge allowed them to anticipate the market. But they were wrong.
People almost always assume that more information is better. Modern corporations are especially beholden to this idea and spend a fortune trying to create "analytic workspaces" that "maximize the informational potential of their decision-makers." These managerial cliches, plucked from the sales brochures of companies such as Oracle and Unisys, are predicated on the assumptions that executives perform better when they have access to more facts and figures and that bad decisions are a result of ignorance. But it's important to know the limitations of this approach, which are rooted in the limitations of the brain.
When people give their brains too many facts and then try to make a decision based on the facts that seem important, they are asking for trouble. They are going to buy the wrong items at Wal-Mart and pick the wrong stocks.
A group of researchers imaged the spinal regions of 98 people who had no back pain or back-related problems. The pictures were then sent to doctors who didn't know that the patients weren't in pain. The result was shocking: the doctors reported that two-thirds of these normal patients exhibited "serious problems" such as bulging, protruding, or herniated discs. In 38 percent of these patients, the MRI revealed multiple damaged discs. Nearly 90 percent of these patients exhibited some form of "disc degeneration." These structural abnormalities are often used to justify surgery.
This is the danger of too much information: it can actually interfere with understanding. When the prefrontal cortex is overwhelmed, a person can no longer make sense of the situation. Correlation is confused with causation, and people make theories out of coincidences.
When you see a painting, you usually know instantly and automatically whether you like it. If someone asks you to explain your judgment, you confabulate. Moral arguments are much the same: Two people feel strongly about an issue, their feelings come first, and their reasons are invented on the fly, to throw at each other.
When you are confronted with an ethical dilemma, the unconscious automatically generates an emotional reaction. (This is what psychopaths can't do.) Within a few milliseconds, the brain has made up its mind; you know what is right and what is wrong. These moral instincts aren't rational - they've never heard of Kant - but they are an essential part of what keep us all from committing unspeakable crimes. It's only at this point - after the emotions have already made the moral decision - that those rational circuits in the prefrontal cortex are activated. People come up with persuasive reasons to justify their moral intuition. When it comes to making ethical decisions, human rationality isn't a scientist, it's a lawyer.
Benjamin Franklin: "So convenient a thing it is to be a reasonable creature, since it enables one to find or make a reason for everything one has a mind to do."
"Moral dumbfounding:" People know something seems morally wrong - (sibling sex example) - but no one can rationally defend the verdict.
The ultimatum game, a staple of experimental economics. The rules of the game are simple, if a little bit unfair: an experimenter pairs two people together, and hands one of them ten dollars. This person (the proposer) gets to decide how the ten dollars is divided. The second person (the responder) can either accept the offer, which allows both players to pocket their respective shares, or reject the offer, in which case both players walk away empty-handed.
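The payoff structure is easy to sketch in code (the responder's rejection threshold below is an illustrative assumption; real responders vary, but lowball offers are in fact routinely rejected):

```python
POT = 10.0  # dollars handed to the proposer

def responder_accepts(offer, threshold=3.0):
    """A fairness-minded responder rejects lowball offers, even though
    rejecting leaves both players with nothing."""
    return offer >= threshold

def play_round(offer):
    """Return (proposer payoff, responder payoff) for a given offer."""
    if responder_accepts(offer):
        return POT - offer, offer
    return 0.0, 0.0

print(play_round(1.0))  # lowball offer is rejected: both walk away with nothing
print(play_round(5.0))  # fair split is accepted
```

A purely "rational" responder would accept any positive offer, since something beats nothing; the point of the experiment is that real people don't behave that way.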
In the dictator game - a variant in which the second player must accept whatever is offered - the dictator lapses into unfettered greed when he cannot see the other player because the two are located in separate rooms. Instead of giving away a significant share of the profits, the despots start offering mere pennies and pocketing the rest. Once people become socially isolated, they stop simulating the feelings of other people.
People who showed more brain activity in their sympathetic regions were also much more likely to exhibit altruistic behavior. Because they intensely imagined the feelings of other people, they wanted to make other people feel better, even if it came at personal expense. But here's the lovely secret of altruism: it feels good. The brain is designed so that acts of charity are pleasurable; being nice to others makes us feel nice.
Autos is Greek for "self," and autism translates to "the state of being unto one's self."
When people were shown a picture of a starving Malawian child, they acted with impressive generosity. They donated an average of $2.50. When other people were provided with a list of statistics about starvation throughout Africa - more than three million children in Malawi are malnourished, more than eleven million people in Ethiopia need immediate food assistance, and so forth - the average donation was 50% lower. The depressing numbers leave us cold: our minds can't comprehend suffering on such a massive scale. This is why we are riveted when one child falls down a well but turn a blind eye to the millions of people who die every year for lack of clean water. And why we donate thousands of dollars to help a single African war orphan featured on the cover of a magazine but ignore widespread genocides in Rwanda and Darfur. As Mother Teresa put it, "If I look at the mass, I will never act. If I look at the one, I will."
A brain that's intolerant of uncertainty - that can't stand the argument - often tricks itself into thinking the wrong thing.
Why are pundits (especially the prominent ones) so bad at forecasting the future? The sin of certainty: it leads the "experts" to mistakenly impose a top-down solution on their decision-making processes.
One of the best ways to distinguish genuine from phony expertise is to look at how a person responds to dissonant data. Does he or she reject the data out of hand? Perform elaborate mental gymnastics to avoid admitting error?
Use your conscious mind to acquire all the information you need for making a decision. But don't try to analyze the information with your conscious mind. Instead, go on holiday while your unconscious mind digests it. Whatever your intuition then tells you is almost certainly going to be the best choice.
The easy problems - the mundane math problems of daily life - are best suited to the conscious brain. These simple decisions won't overwhelm the prefrontal cortex. In fact, they are so simple that they tend to trip up the emotions.
If the decision can be accurately summarized in numerical terms: let the rational brain take over.
For important decisions about complex items, think less about those items that you care a lot about. Don't be afraid to let your emotions choose.
Simple problems require reason.
Novel problems also require reason. Before you entrust a mystery to the emotional brain, before deciding to let your instincts make a big bet in poker or fire a missile at a suspicious radar blip, ask yourself a question: How does your past experience help solve this particular problem?
If the problem really is unprecedented - if it's like a complete hydraulic failure in a Boeing 737 - then emotions can't save you.
Whenever possible, it's essential to extend the decision-making process and properly consider the argument unfolding inside your head. Bad decisions happen when that mental debate is cut short, when an artificial consensus is imposed on the neural quarrel.
There are two simple tricks to help ensure that you never let certainty interfere with your judgment:
#1 : Always entertain competing hypotheses. When you force yourself to interpret the facts through a different, perhaps uncomfortable lens, you often discover that your beliefs rest on a rather shaky foundation. For instance, when Michael Binger is convinced that another player is bluffing, he tries to think about how the player would be acting if he wasn't bluffing. He is his own devil's advocate.
#2 : Continually remind yourself of what you don't know. Even the best models and theories can be undone by utterly unpredictable events.
Powell: "Tell me what you know. Then tell me what you don't know, and only then can you tell me what you think. Always keep those three separated."
The reason these emotions are so intelligent is that they've managed to turn mistakes into educational events. You are constantly benefiting from experience, even if you're not consciously aware of the benefits. The brain always learns the same way, accumulating wisdom through error. There are no shortcuts to this painstaking process; becoming an expert just takes time and practice. But once you've developed expertise in a particular area - once you've made the requisite mistakes - it's important to trust your emotions when making decisions in that domain. It is feelings, after all, and not the prefrontal cortex, that capture the wisdom of experience.
This doesn't mean the emotional brain should always be trusted. Sometimes it can be impulsive and short-sighted. Sometimes it can be a little too sensitive to patterns. However, the one thing you should always be doing is considering your emotions, thinking about why you're feeling what you're feeling. Even when you choose to ignore your emotions, they are still a valuable source of input.
If you're going to take only one idea away from this book, take this one:
Whenever you make a decision, be aware of the kind of decision you are making and the kind of thought process it requires.
The best way to make sure that you are using your brain properly is to study your brain at work, to listen to the argument inside your head.
You can't avoid loss aversion unless you know that the mind treats losses differently than gains. And you'll probably think too much about buying a house unless you know that such a strategy will lead you to buy the wrong property. The mind is full of flaws, but they can be outsmarted.
The best decision-makers don't despair. Instead, they become students of error, determined to learn from what went wrong. They think about what they could have done differently so that the next time their neurons will know what to do.
Classroom flight school: The problem with this approach is that everything was abstract. Pilots had this body of knowledge, but they'd never applied it before. The benefit of a flight simulator is that it allows pilots to internalize their new knowledge. Instead of memorizing lessons, a pilot can train the emotional brain, preparing the parts of the cortex that will actually make the decision.
The goal is to learn from those mistakes when they don't count, so that when it really matters, you can make the right decision. This approach targets the dopamine system, which improves itself by studying its errors. As a result, pilots develop accurate sets of flight instincts. Their brains have been prepared in advance.
The ideal atmosphere for good decision-making: one where a diversity of opinions is openly shared.
The evidence is looked at from multiple angles, and new alternatives are considered.
Amanda Cook, my editor at Houghton Mifflin Harcourt, was a godsend. She took a messy, convoluted, unstructured manuscript and managed to find the thread that brought it all together. She suggested stories, fixed my prose, and talked me through my confusion. She is the kind of editor - thoughtful, whip-smart, and generous - that every writer dreams about.