Thinking in Bets

 

Intro:
Not a poker book (poker strategy or gambling), but a book about decision-making under uncertainty.

 

Thinking in bets starts with recognizing that there are exactly two things that determine how our lives turn out: the quality of our decisions and luck. Learning to recognize the difference between the two is what thinking in bets is all about.

 

Chapter 1: Life Is Poker, Not Chess


Pete Carroll (NFL coach) was a victim of our tendency to equate the quality of a decision with the quality of its outcome.

 

Sometimes something is a bad result, not a bad decision (e.g., firing an executive).

 

Hindsight bias is the tendency, after an outcome is known, to see the outcome as having been inevitable. (“I should have known…” “I should have seen it coming.”)

 

No sober person thinks getting home safely after driving drunk reflects a good decision or good driving ability.

 

When we work backward from results to figure out why those things happened, we are susceptible to a variety of cognitive traps, like assuming causation where there is only correlation, or cherry-picking data to confirm the narrative we prefer.


We all struggle to execute our best intentions. Poker players have the same struggle, with the added challenges of time pressure and immediate financial consequences. That makes poker a great place to find innovative approaches to overcoming this struggle.

 

But life is more like poker. You could make the smartest, most careful decision in firing a company president and still have it blow up in your face.

When the sample is not big enough (like a coin flipped four times), there is a correct answer: “I’m not sure.”
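A quick numerical sketch of why four flips leave you unsure (not from the book; the 3-heads-in-4 sample and the 0.7 coin are just illustrative assumptions):

```python
from math import comb

def prob_heads(k, n, p):
    """Probability of exactly k heads in n flips of a coin that lands heads with probability p."""
    return comb(n, k) * p**k * (1 - p)**(n - k)

# Suppose we observed 3 heads in 4 flips. How likely is that under different coins?
for p in (0.5, 0.7):
    print(f"P(3 heads in 4 flips | p={p}) = {prob_heads(3, 4, p):.2f}")
# ~0.25 for a fair coin and ~0.41 for a heavily biased one: the sample is
# consistent with both, so the honest answer really is "I'm not sure".
```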

 

Embracing “I’m not sure” is difficult. We are trained in school that saying “I don’t know” is a bad thing.

What good poker players and good decision-makers have in common is their comfort with the world being an uncertain and unpredictable place. They understand that they can almost never know exactly how something will turn out.

There are many reasons why wrapping our arms around uncertainty and giving it a big hug will help us become better decision-makers.

When we think in advance about the chances of alternative outcomes and make a decision based on those chances, it doesn’t automatically make us wrong when things don’t work out. It just means that one event in a set of possible futures occurred.

 

Decisions are bets on the future, and they aren’t “right” or “wrong” based on whether they turn out well on any particular iteration.

When we think probabilistically, we are less likely to use adverse results alone as proof that we made a decision error, because we recognize the possibility that the decision might have been good but luck and/or incomplete information (and a sample size of one) intervened.

 

Maybe we made the best decision from a set of unappealing choices, none of which were likely to turn out well.

 

Maybe we committed our resources on a long shot because the payout more than compensated for the risk, but the long shot didn’t come in this time.
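A made-up example of that arithmetic (the probabilities and dollar amounts are assumptions, not from the book): a long shot can be worth taking because the expected value is positive, even though it usually loses.

```python
# Hypothetical long shot: 20% chance to win $1,000 (net), 80% chance to lose the $100 stake.
p_win, win_amount, stake = 0.20, 1_000, 100

expected_value = p_win * win_amount - (1 - p_win) * stake
print(f"Expected value per bet: ${expected_value:.0f}")  # $120 per bet on average

# Positive expected value, yet the bet still loses 80% of the time --
# a good decision that will often produce a bad outcome.
```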

Redefining wrong allows us to let go of all the anguish that comes from getting a bad result. But it also means we must redefine “right”. If we aren’t wrong just because things didn’t work out, then we aren’t right just because things turned out well.

Should we be willing to give up the good feeling of “right” to get rid of the anguish of “wrong”? Yes.

 

First, the world is a pretty random place. Second, being wrong hurts us more than being right feels good (loss aversion).

 

Chapter 2: Wanna Bet?


Whenever we choose an alternative, we are automatically rejecting every other possible choice. All those rejected alternatives are paths to possible futures where things could be better or worse than the path we chose. There is potential opportunity cost in any choice we forgo.

 

All decisions are bets. The often overlooked aspects of betting are choice, probability, risk, decision, and belief.

 

In most of our decisions, we are not betting against another person. Rather, we are betting against all the future versions of ourselves that we are not choosing.

 

If we can find ways to become more comfortable with uncertainty, we can see the world more accurately and be better prepared for it.

 

We bet based on what we believe about the world. This is ultimately very good news: part of the skill in life comes from learning to be a better belief calibrator, using experience and information to more objectively update our beliefs to more accurately represent the world.
There are effective strategies to be more open-minded, more objective, more accurate in our beliefs, more rational in our decisions and actions, and more compassionate toward ourselves in the process.

 

Believing creatures → Harvard professor Daniel Gilbert → Findings from a multitude of research literatures converge on a single point: people are credulous creatures who find it very easy to believe and very difficult to doubt. In fact, believing is so easy, and perhaps so inevitable, that it may be more like involuntary comprehension than it is like rational assessment.
Our default setting is to believe that what we hear is true.


Flaws in forming and updating beliefs have the potential to snowball. Once a belief is lodged, it becomes difficult to dislodge. It takes on a life of its own, leading us to notice and seek out evidence confirming our belief, rarely challenge the validity of confirming evidence, and ignore or work hard to actively discredit information contradicting the belief. This irrational, circular information-processing pattern is called motivated reasoning.

It doesn’t take much for any of us to believe something. And once we believe it, protecting that belief guides how we treat further information relevant to the belief.

 

As Daniel Kahneman pointed out, we just want to think well of ourselves and feel that the narrative of our life story is a positive one. Being wrong doesn’t fit into the narrative.

 

Acknowledging that decisions are bets based on our beliefs, getting comfortable with uncertainty, and redefining right and wrong are integral to a good overall approach to decision-making.

 

Chapter 3: Bet to Learn: Fielding the Unfolding Future

 

While experience is necessary to becoming an expert, it’s not sufficient. There is a big difference between getting experience and becoming an expert.

 

The way we field outcomes is predictably patterned: we take credit for the good stuff and blame the bad stuff on luck so it won’t be our fault. The result is that we don’t learn from experience well. “Self-serving bias” is the term for this pattern of fielding outcomes.

 

Despite the difficulties, striving for accuracy through probabilistic thinking is a worthwhile routine to pursue.

 

The benefits of recognizing just a few extra learning opportunities compound over time. The cumulative effect of being a little better at decision-making, like compounding interest, can have huge effects in the long run on everything we do.

 

Chapter 4: The Buddy System

 

We know our decision-making can improve if we find other people to join us in truth seeking. And we know we need an agreement. What’s in the agreement? What are the features of a productive decision-making pod?

Some answers to these questions:

Chapter 5 provides a blueprint for rules of engagement with truth seeking groups, how we keep the group from drifting off course, and the productive habits of mind the group can reinforce in each of us.

 

Motivated reasoning and self-serving bias are two habits of mind that are deeply rooted in how our brains work. We have a huge investment in confirmatory thought, and we fall into these biases all the time without even knowing it. Confirmatory thought is hard to spot, hard to change, and, if we do try changing it, hard to self-reinforce. It is one thing to commit to reward ourselves for thinking in bets, but it is a lot easier if we get others to do the work of rewarding us.

 

Once we are in a group that regularly reinforces exploratory thought, the routine becomes reflexive, running on its own. Exploratory thought becomes a new habit of mind, the new routine, and one that is self-reinforced. In a Pavlovian way, after enough approval from the group for doing the hard work of thinking in bets, we get the same good feeling from focusing on accuracy on our own. We internalize the group’s approval, and, as a matter of habit, we begin to do the kind of things that would earn it when we are away from the group (which is, after all, most of the time).

 

Accountability is a willingness or obligation to answer for our actions or beliefs to others.

 

Diversity is the foundation of productive group decision-making, but we shouldn’t underestimate how hard it is to maintain. We all tend to gravitate toward people who are near clones of us.

 

Even research communities of highly intelligent and well-meaning individuals can fall prey to confirmation bias, as IQ is positively correlated with the number of reasons people find to support their own side in an argument. That’s how robust these biases are. We see that even judges and scientists succumb to these biases. We shouldn’t feel bad, whatever our situation, about admitting that we also need help.

 

In psychology, there has been a controversy over the last decade about a potentially large number of published studies with results subsequent researchers could not replicate. The Reproducibility Project: Psychology has been working on replicating studies from top psychology journals. Anna Dreber, a behavioral economist at the Stockholm School of Economics, with several colleagues set up a betting market based on these replication attempts.

 

A growing number of businesses are, in fact, implementing betting markets to address the difficulty of eliciting and encouraging contrary opinions. Companies implementing prediction markets to test decisions include Google, Microsoft, GE, Eli Lilly, Pfizer, and Siemens. People are more willing to offer their opinion when the goal is to win a bet rather than get along with people in a room.

 

Corporate Prediction Markets for Innovation Management:

file:///C:/Users/m906523/Downloads/9783319145228-c1.pdf

https://hackernoon.com/corporate-prediction-markets-a-roundup-of-the-major-players-3632b5d3cf09

 

Chapter 5: Dissent to Win

 

Ideal-type model of a self-correcting epistemic community (according to Robert K. Merton).

If you want to pick a role model for designing a group’s practical rules of engagement, you can’t do better than Merton.

 

CUDOS:

Communism: Be a data sharer. That’s what experts do.

We are naturally reluctant to share information that could encourage others to find fault in our decision-making.
In my consulting, I’ve encouraged companies to make sure they don’t define “winning” solely by results or by providing a self-enhancing narrative. If part of corporate success consists of providing the most accurate, objective, and detailed evaluation of what’s going on (how?), employees will compete to win on those terms.

Agree to be a data sharer and reward others in your decision group for telling more of the story.

 

Universalism:
Nearly any group can create an exercise to develop and reinforce the open-mindedness universalism requires.


If we hear an account from someone we like, imagine if someone we didn’t like told us the same story, and vice versa.

 

That requires open-mindedness to the messages that come from places we don’t like.

 

Disinterestedness: Avoid conflicts of interest. If two people disagree, a referee can get them each to argue the other’s position with the goal of being the best debater.

 

Organized Skepticism:

Skepticism gets a bad rap because it tends to be associated with negative character traits. Someone who disagrees could be considered “disagreeable”. Someone who dissents may be creating “dissension”. Maybe part of it is that “skeptical” sounds like “cynical”. Yet true skepticism is consistent with good manners, civil discourse, and friendly communications.


Just as the CIA has red teams and the State Department has its Dissent Channel, we can incorporate dissent into our business and personal lives. We can create a pod whose job is to present the other side, to argue why a strategy might be ill-advised, why a prediction might be off, or why an idea might be ill-informed.

 

Chapter 6: Adventures in Mental Time Travel

 

In real-life decision-making, when we bring our past- or future-self into the equation, the space-time continuum doesn’t unravel. Far from turning us into a liquefied blob, a visit from past or future versions of us helps present-us make better bets.

 

Just as we can recruit other people to be our decision buddies, we can recruit other versions of ourselves to act as our own decision buddies.

 

The best poker players develop practical ways to incorporate their long-term strategic goals into their in-the-moment decisions. The rest of this chapter is devoted to many of these strategies designed to recruit past- and future-us to help with all the execution decisions we have to make to reach our long-term goals.

 

Night Jerry: This tendency we all have to favor our present-self at the expense of our future-self is called temporal discounting. We are willing to take an irrationally large discount to get a reward now instead of waiting for a bigger reward later. Bringing our future-self into the decision gets us started thinking about the future consequences of those in-the-moment decisions: “Hey, don’t forget about me. I’m going to exist and I’d like you to please take that into account.”
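The book doesn’t give a formula, but the standard hyperbolic-discounting model (subjective value = amount / (1 + k × delay)) is one way to see the Night Jerry problem in numbers; the discount rate k and the dollar amounts below are purely illustrative assumptions.

```python
def present_value(amount, delay_days, k=0.05):
    """Hyperbolic discounting: subjective value today of a reward arriving after delay_days."""
    return amount / (1 + k * delay_days)

# Night Jerry's choice: $100 right now vs. $150 in 30 days.
print(present_value(100, 0))    # 100.0
print(present_value(150, 30))   # 60.0 -- the larger, later reward *feels* smaller today
```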

 

Moving regret in front of our decisions: Business journalist and author Suzy Welch developed a popular tool known as 10-10-10 that has the effect of bringing future-us into more of our in-the-moment decisions. What are the consequences of each of my options in ten minutes? In ten months? In ten years?

 

Backcasting: When it comes to thinking in advance, standing at the end and looking backward is much more effective than looking forward from the beginning. When we identify the goal and work backward from there to “remember” how we got there, the research shows that we do better. “Prospective hindsight”, imagining that an event has already occurred, increases the ability to correctly identify reasons for future outcomes by 30%.

 

Premortems: A premortem is an investigation into something awful, but before it happens. We all like to bask in an optimistic view of the future. We are generally biased to overestimate the probability of good things happening. Despite the popular wisdom that we achieve success through positive visualization, it turns out that incorporating negative visualization makes us more likely to achieve our goals.

 
