class: center, middle, inverse, title-slide

.title[
# Decision theory
]
.subtitle[
## How To Think - Week 13 and 14
]
.author[
### Fernando Alvear
]
.institute[
### University of Missouri
]
.date[
### Apr 20
]

---
class: middle, center

<script type="text/x-mathjax-config">
MathJax.Hub.Config({
  TeX: {
    Macros: {
      And: "{\\mathop{\\&}}",
      Not: "{\\sim}"
    }
  }
});
</script>

# Decision theory

The branch of philosophy and economics concerned with the logic of choices.

---

# The logic of choices

- Good and bad bets
- Expected value
- Diminishing marginal utility
- Fair price of bets
- Risk
- Exercises
- Pitfalls in decision
- Outcome framing
- Endowment effect
- Honoring sunk costs

---

# A bet

Suppose I offer you the following bet:

> Bet 1: “Let's bet on the result of a coin toss. If the coin comes up heads, I'll give you $200, but if the coin comes up tails, you pay me $10.”

Question for you: _Should_ you take this bet?

What about this bet?

> Bet 2: “If the coin comes up heads, I'll give you $200, but if the coin comes up tails, you pay me $300.”

What about this other one?

> Bet 3: “If the coin comes up heads, I'll give you $40,000, but if the coin comes up tails, you pay me $20,000.”

The question in each case is: Should you take this bet?

Decision theory has some tools to determine _how good your choices are_.

---

# How to know if the bet is good?

Advice from philosophers and economists:

- Calculate the _expected value_ of each option.
- Choose the option with the maximum expected value.

.shadow[
.emphasis[
**Calculating expected value**:<br>
To calculate the expected value of an option, we multiply the payoff of each possible outcome by its probability, and then sum up the results.
]
]

> Bet 1: “Let's bet on the result of a coin toss. If the coin comes up heads, I'll give you $200, but if the coin comes up tails, you pay me $10.”

- What are the **actions/options** available to you?
- What are the possible **outcomes**?
- What is the **probability** of each outcome?
- What are the **consequences** (payoffs) of each possible outcome?

---

> Bet 1: “Let's bet on the result of a coin toss. If the coin comes up heads, I'll give you $200, but if the coin comes up tails, you pay me $10.”

Let's organize the data from the question in the following decision table:

.center[
<img src="assets/decision-table-01.png" alt="" height="250"/>
]

Let's calculate the expected value of the options available:

$$EV(\text{betting}) = (\$200 \times 0.5) + (-\$10 \times 0.5) = \$100 - \$5 = \$95$$

$$EV(\text{not betting}) = (\$0 \times 0.5) + (\$0 \times 0.5) = \$0 + \$0 = \$0$$

---

# How to know if the bet is good?

- Calculate the _expected value_ of each option. ✓
- Choose the option with the greatest expected value (maximize expected value).

The option with the greatest expected value is _betting_ ($95 vs. $0). So apparently, you should bet.

---

# Intuition behind expected value

Question: What does it mean that betting has an expected value of $95?

Answer: It means that if you took this bet many times, you would win, on average, $95 per bet.

Imagine you play this game with me several times. Sometimes you will win $200, sometimes you will lose $10. But on average, and in the long run, you can expect to win $95 on each coin toss.

For example, if you take this bet 10 times, you can expect to win around $95 x 10 = $950.

---

# Bet 2

> Bet 2: “If the coin comes up heads, I'll give you $200, but if the coin comes up tails, you pay me $300.”

.center[
<img src="assets/decision-table-02.png" alt="" height="250"/>
]

Let's calculate the expected value of the options available:

$$EV(\text{betting}) = (\$200 \times 0.5) + (-\$300 \times 0.5) = \$100 - \$150 = -\$50$$

$$EV(\text{not betting}) = (\$0 \times 0.5) + (\$0 \times 0.5) = \$0 + \$0 = \$0$$

Option with the greatest expected value: Not betting!

---

# Bet 3

> Bet 3: “If the coin comes up heads, I'll give you $40,000, but if the coin comes up tails, you pay me $20,000.”
.center[
<img src="assets/decision-table-03.png" alt="" height="250"/>
]

Let's calculate the expected value of the options available:

$$EV(\text{betting}) = (\$40,000 \times 0.5) + (-\$20,000 \times 0.5) = \$20,000 - \$10,000 = \$10,000$$

$$EV(\text{not betting}) = (\$0 \times 0.5) + (\$0 \times 0.5) = \$0 + \$0 = \$0$$

Option with the greatest expected value: Betting!? Why?

---

# Diminishing marginal utility

Money, like most goods, has diminishing marginal utility: each additional dollar you gain is worth less to you than the one before it.

.center[
<img src="assets/marginal-utility-01.png" alt="" height="250"/>
]

When something has diminishing marginal utility, each additional unit provides less and less utility. Compare:

- The difference between having $0 and $1,000.
- The difference between having $1,000,000 and $1,001,000.

---

Money is not the only thing with diminishing marginal utility; many other goods have it too.

.center[
<img src="assets/marginal-utility-02.png" alt="" height="250"/>
]

Typical example: food (the first bite is great, the next one is not as great as the first, and if you eat excessively, you may end up feeling worse than when you started eating).

More importantly, the phenomenon of diminishing marginal utility helps us distinguish the nominal **value** of something from the __utility__ it has for us.

---

# Diminishing marginal utility and Bet 3

> Bet 3: “If the coin comes up heads, I'll give you $40,000, but if the coin comes up tails, you pay me $20,000.”

- If all you have is $20,000, those $20,000 are much more valuable to you than the $40,000 you would get if you bet and win.
- The utility you would lose by losing the bet is greater than the utility you would gain by winning it.
- If you have a lot of money, say $1,000,000, the utility of winning $40,000 is greater than the utility of losing $20,000, so it would be reasonable for you to take the bet.
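---

The point about Bet 3 can be made concrete with a concave utility function. This is only a sketch: the square-root utility function and the wealth levels are illustrative assumptions, not part of the exercise.

```python
import math

def expected_utility(wealth, outcomes, utility=math.sqrt):
    """Expected utility of a gamble: sum of p * u(wealth + payoff).

    `utility` is assumed concave (sqrt here) to model diminishing
    marginal utility; this is an illustrative choice, not a standard.
    """
    return sum(p * utility(wealth + payoff) for payoff, p in outcomes)

bet3 = [(40_000, 0.5), (-20_000, 0.5)]

# With only $20,000, the bet risks everything you have:
# expected utility of betting falls below the utility of standing pat.
poor = 20_000
print(expected_utility(poor, bet3) < math.sqrt(poor))   # betting is worse

# With $1,000,000, the same bet raises expected utility.
rich = 1_000_000
print(expected_utility(rich, bet3) > math.sqrt(rich))   # betting is better
```

The monetary expected value of the bet is +$10,000 in both cases; only the agent's wealth, and hence the marginal utility of the money at stake, changes the verdict.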
---

# Summary

.shadow[
.emphasis[
**Calculating expected value**<br>
To calculate the expected value of an option, we multiply the payoff of each possible outcome by its probability, and then sum up the results.
]
]

.shadow[
.emphasis[
**Expected value rule**<br>
Act so as to maximize expected value (choose the option with the greatest expected value).
]
]

.shadow[
.emphasis[
**Diminishing marginal utility**<br>
Property of a good, such that the more you have of it, the less utility each additional unit provides.
]
]

---

# Fair price

Consider this bet again:

> Bet 1: “Let's bet on the result of a coin toss. If the coin comes up heads, I'll give you $200, but if the coin comes up tails, you pay me $10.”

This bet is very good for you, but bad for me.

$$EV(\text{betting for you}) = (\$200 \times 0.5) + (-\$10 \times 0.5) = \$100 - \$5 = \$95$$

$$EV(\text{betting for me}) = (\$10 \times 0.5) + (-\$200 \times 0.5) = \$5 - \$100 = -\$95$$

If we both want to play this game in fair conditions, then I should charge you some money to enter into this bet. This is the fair price of the bet, which is equal to its expected value. If we include the fair price as a loss for you, the expected value of the bet is $0.

$$\begin{aligned} EV(\text{betting for you}) &= ((\$200 - \$95) \times 0.5) + ((-\$10 - \$95) \times 0.5) \\\ &= (\$105 \times 0.5) + (-\$105 \times 0.5) \\\ &= \$52.5 - \$52.5 = \$0 \end{aligned}$$

---

# Risk

> Option 1: “Let's bet on the result of a coin toss. If the coin comes up heads, I'll give you $200, but if the coin comes up tails, you pay me $10.”

> Option 2: “I'll give you $95. No questions asked.”

Which option is better?

$$EV(\text{option 1}) = (\$200 \times 0.5) + (-\$10 \times 0.5) = \$100 - \$5 = \$95$$

$$EV(\text{option 2}) = \$95 \times 1 = \$95$$

The fact that many people choose option 2 over option 1 shows that people are usually **risk-averse**. Rarely are people **risk-prone**.
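---

The two options have exactly the same expected value, so the expected value rule is indifferent between them. Risk aversion can be modeled with a concave utility function; the square-root utility and the starting wealth below are assumptions made purely for illustration.

```python
import math

# Hypothetical starting wealth (an assumption for this sketch).
wealth = 100

# Option 1: coin toss, win $200 or lose $10. Option 2: a sure $95.
# Both have an expected monetary value of $95.
eu_option1 = 0.5 * math.sqrt(wealth + 200) + 0.5 * math.sqrt(wealth - 10)
eu_option2 = math.sqrt(wealth + 95)

# A risk-averse (concave-utility) agent prefers the sure $95.
print(eu_option2 > eu_option1)
```

For a risk-prone agent, a convex utility function (e.g. squaring instead of taking the square root) would reverse the preference.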
---

# Caveats on applying the expected value rule

To calculate the expected value of an action or outcome, it's good to start by identifying the following pieces of information:

- Options/actions
- Outcomes
- Probability of outcomes
- Payoffs/consequences of each outcome

The expected value rule says you should do the action with the greatest expected value. However, it's best to take this rule as a heuristic (a rule of thumb) and not as an irrefutable mandate.

As we saw before, decisions can be legitimately affected by other factors, such as:

- The diminishing marginal utility of what's at stake.
- The _real_ utility of what's at stake.
- Your tolerance for risk.

... and perhaps other factors as well.

---

# Exercise: expected value of travel time

You have a job interview in Ottawa, Canada, in midwinter. There are two ways to go: train or plane. A bad storm is predicted with probability 0.2. The storm will not affect the time required to get from your home to the departure point, or from the arrival point to your place of interview.

- Train: 30 mins. to train station, departure 10 mins. later, 20 mins. from station to interview. (= 1 hr.)
- Plane: 80 mins. to airport, departure 1 hr. later, 40 mins. from airport to interview. (= 3 hrs.)

If there is no storm:

- Train time: 5 hrs.
- Plane time: 1 hr.

If there's a bad storm:

- Train time: 7 hrs.
- Plane time: plane grounded for 10 hrs., then 1 hr. of flight.

---

- Options/actions: Train or plane.
- Outcomes: Bad storm or no bad storm.
- Probability of outcomes: 0.2 and 0.8
- Payoffs/consequences of outcomes:
  - If no storm: Train 5h / Plane 1h
  - If storm: Train 7h / Plane 11h
- Additional info:
  - Extra time train: 1h / Extra time plane: 3h

This is best described in the following decision table:

.center[
<img src="assets/decision-table-04.png" alt="" height="250"/>
]

`$$\begin{aligned} EV(\text{Train}) &= (-7h \times 0.2) + (-5h \times 0.8) + (-1h) = -6.4h = -6h, 24m \\\ EV(\text{Plane}) &= (-11h \times 0.2) + (-1h \times 0.8) + (-3h) = -6h \end{aligned}$$`

---

Since `\(EV(\text{Train}) < EV(\text{Plane})\)`, the expected value rule recommends taking the plane. However, there might be other factors.

- Best-case scenarios
  - Train: 5h + 1h = 6h
  - Plane: 1h + 3h = 4h
- Worst-case scenarios
  - Train: 7h + 1h = 8h
  - Plane: 11h + 3h = 14h

If you would like to avoid spending 14 hrs. just travelling, take the train!

---

## Roulette

.pull-left.w40[
<img src="assets/roulette.jpg" alt="" height="550"/>
]

.pull-right.w60[
1. Single number bet pays 35 to 1. Also called “straight up.”
2. Double number bet pays 17 to 1. Also called a “split.”
3. Three number bet pays 11 to 1. Also called a “street.”
4. Four number bet pays 8 to 1. Also called a “corner bet.”
5. Five number bet pays 6 to 1. Only one specific bet, which includes the following numbers: 0-00-1-2-3.
6. Six number bet pays 5 to 1. Example: 7, 8, 9, 10, 11, 12. Also called a “line.”
7. Twelve numbers or dozens (first, second, third dozen) pays 2 to 1.
8. Column bet (12 numbers in a row) pays 2 to 1.
9. 18 numbers (1-18) pays even money.
10. 18 numbers (19-36) pays even money.
11. Red or black pays even money.
12. Odd or even bets pay even money.
]

---

At most casinos in North America, a standard roulette wheel has 18 reds, 18 blacks, and 2 zeros (colored green). A simple bet of $1 on red pays back $2 if and only if the wheel stops at a red segment (your net win is then $1). If the wheel stops at another segment, you lose what you bet.
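This game is easy to simulate. A quick Monte Carlo estimate of the average payoff per $1 red bet, as a sketch that assumes the 38-segment American wheel just described:

```python
import random

def spin_red_bet(rng):
    """Net payoff of a $1 bet on red: 18 of the 38 segments are red."""
    return 1 if rng.randrange(38) < 18 else -1

rng = random.Random(0)  # fixed seed so the estimate is reproducible
n = 100_000
average = sum(spin_red_bet(rng) for _ in range(n)) / n
print(round(average, 3))  # estimated expected value per $1 bet
```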
What is the expected value of betting $1 on red? If you make the same bet 10 times, how much money would you expect to win or lose?

---

# Cognitive pitfalls when making decisions

- Outcome framing
- Endowment effect
- Honoring sunk costs

---

# Outcome framing

> Suppose you are a doctor helping a patient choose a treatment for lung cancer. Both treatments have the following chances of fatal complications in the first month:

> Treatment A has a 90% survival rate in the first month. <br> Treatment B has a 10% mortality rate in the first month.

Which treatment would you recommend to your patient?

Both treatments have the same chances of survival, but studies show that people more often choose treatment A. Why? What can explain this?

- Inherent risk-avoidance
- System 1 / System 2

---

.shadow[
.emphasis[
**Outcome framing**<br>
Phenomenon in which our decisions are influenced by the way the data is presented. Equivalent information can be more or less attractive depending on which features are highlighted.
]
]

.center[
<img src="assets/outcome-framing.jpg" alt="" height="400"/>
]

---

# Endowment effect

In a famous study, one group receives a mug for free and is then asked whether they would like to trade it for a chocolate bar. Another group is offered the choice between a free mug and a chocolate bar. Most participants in the first group refuse to let go of the mug for the chocolate bar, while most participants in the second group prefer the chocolate bar to the mug.

.center[
<img src="assets/endowment-effect.jpg" alt="" height="300"/>
]

---

This experiment shows that, when we think of something as belonging to us, we assign it much more value than we would if it were not ours.

.shadow[
.emphasis[
**Endowment effect**<br>
Phenomenon in which we assign more value to something just because we own it, compared to something we don’t.
]
]

Other examples:

- Free trials or samples in marketing.
- Hoarding disorders.
---

# Honoring sunk costs

> Imagine that you bought a concert ticket a few weeks ago for $50. You then learn that the show is boring, as the band plays only new songs no one knows. On the day of the concert, you feel sick and it’s raining outside. You know that traffic will be worse because of the rain and that you risk getting sicker by going to the concert. Although it seems as though the current drawbacks outweigh the benefits, you decide to go to the concert so as to "not waste the ticket."

Is this decision (going to the concert) rational?

- Consequences of staying at home: 0.
- Consequences of going to the concert:
  - Enjoyment: minimal
  - Discomfort: substantial

---

.shadow[
.emphasis[
**Honoring sunk costs**<br>
A sunk cost is a cost that has already been incurred and cannot be recovered. Honoring a sunk cost means taking those irrecoverable costs into account when making a decision.
]
]

In economic terms, sunk costs are costs that have already been incurred and cannot be recovered. In the previous example, the $50 spent on the concert ticket would not be recovered whether or not you attended the concert. It therefore should not be a factor in our current decision-making.

Other examples:

- Finishing watching a boring movie
- Business decisions
- The Concorde

> In 1956, the Supersonic Transport Aircraft Committee met to discuss building a supersonic airplane, the Concorde. French and British engine manufacturers and the French and British governments were involved in the project, which was estimated to cost almost 100 million dollars. Long before the project was over, it was clear that costs were increasing and that the financial gains from the plane, once in use, would not offset them.

---

# What do these pitfalls have in common?

All of these pitfalls share a common issue: in deciding what to do, they introduce a factor that is _irrelevant_ to the goodness or badness of a decision.
- In the case of outcome framing, the way the outcomes are cast is irrelevant to how much we value them.
- In the case of the endowment effect, the fact that we own something is irrelevant to the value we assign to it.
- In the case of honoring sunk costs, the already incurred and now unrecoverable costs are irrelevant to the expected value of the current options.
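---

The sunk-cost point can also be checked formally: a sunk cost subtracts the same amount from every option, so it can never change which option is best. A minimal sketch of this, with made-up payoff numbers for the concert example:

```python
def best_option(payoffs):
    """Return the option with the greatest payoff."""
    return max(payoffs, key=payoffs.get)

# Hypothetical payoffs, in arbitrary utility units (illustration only).
payoffs = {"stay home": 0, "go to concert": -30}

sunk = -50  # the ticket price, already paid either way
with_sunk = {option: value + sunk for option, value in payoffs.items()}

# The sunk cost shifts every payoff equally, so the ranking is unchanged.
print(best_option(payoffs) == best_option(with_sunk))  # → True
```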