Wednesday, July 20, 2011

Irrational Risk Avoidance

Behavioural economists have found many ways to expose our irrational tendencies. Given a choice between two options, people are sometimes too conservative and at other times too reckless.

One example is whether you’re willing to play a game where you toss a coin and win $20 for heads or lose $10 for tails. From a purely mathematical point of view, almost all people should be willing to play this game. But many are not. Of course the reasons for refusing to play may have nothing to do with math. Some people object to gambling. Others fear that the game is somehow rigged.

Usually, I can overcome any initial irrational feelings about these games to figure out which choice the researchers consider correct. However, there is one game that I have a hard time with:

Suppose that you win a prize and your reward is one of the following two choices:

1. $3000 for certain.

2. $4000 with a probability of 80% and nothing with probability 20%.

It turns out that a strong majority of people choose the certain $3000. An insurance company with deep pockets would say the expected value of choice 2 is 80% of $4000 or $3200, which is more than $3000, and therefore choice 2 is better. For those of us without effectively unlimited resources, the calculation is a little more complicated.

To take into account the cost of volatility, it’s best to look at the expected compound return. (For mathy types this is the expected value of the logarithm of your net worth.) It turns out that even if your net worth is only $5000, you should go for choice 2. The majority of us who have much more to our names than $5000 should have a strong preference for choice 2.
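
If you want to check this yourself, here's a quick sketch (Python; the helper name is just something I made up for illustration) that computes the certainty equivalent of each choice under log utility for a few starting net worths:

    import math

    def certainty_equivalent(net_worth, outcomes):
        # outcomes is a list of (probability, prize) pairs; the certainty equivalent
        # under log utility is e raised to the expected log of final net worth
        expected_log = sum(p * math.log(net_worth + prize) for p, prize in outcomes)
        return math.exp(expected_log)

    for w in (5000, 50000, 500000):
        sure = certainty_equivalent(w, [(1.0, 3000)])              # choice 1
        gamble = certainty_equivalent(w, [(0.8, 4000), (0.2, 0)])  # choice 2
        print(w, round(sure), round(gamble))

Even at a net worth of $5000 the gamble comes out slightly ahead, and its edge grows toward the risk-neutral $200 as net worth rises.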

However, I can’t shake the feeling that choice 1 is better. In fact, if I were actually given a chance to play this game only once, I’d probably take the sure $3000. For most of the other test questions I’ve seen researchers use, I’m fairly confident that I could make the more rational choice, but not for this one. So chalk one up for the ancient brain that sees losses in terms of getting eaten by a lion.

19 comments:

  1. Your idea of using expected log of net worth to account for volatility is interesting. Can you explain further the motivation for this particular metric?

  2. @Anonymous: I can't claim credit for this idea -- it has been around since long before I was born. This method of computing returns gives the geometric return (or compound return). If you take a sequence of yearly returns and just compute their (arithmetic) average, the result will be different from the compound average return. Consider an example where you lose 20% one year and gain 30% the next. The arithmetic average return is 5%. But if you started with $1000, you'd end up with $1040 for a compound average return of just under 2% per year.
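
    If you want to check the arithmetic, here's a quick sketch in Python (the $1000 starting amount is just the example above):

        returns = [-0.20, 0.30]                      # lose 20%, then gain 30%
        arithmetic = sum(returns) / len(returns)     # 0.05, i.e. 5% per year
        final = 1000 * (1 - 0.20) * (1 + 0.30)       # $1040
        compound = (final / 1000) ** (1 / len(returns)) - 1   # about 0.0198, just under 2% per year
        print(arithmetic, final, compound)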

  3. I think the $3,000 sure thing is the smart choice.

    Going with the probabilities makes sense if this contest is repeated over and over again. Over the long haul, you should average $3,200 per contest with choice #2, which is better than $3,000.

    But if you are only entering the contest once, I think it's too risky to choose #2.

    Choice #2 (one time only) is kind of like investing in equities for your house down payment. I don't care what the expected returns are; it's just too risky, with a major downside if it doesn't work out.

    @Mike: My gut agrees with you, but the math says we're both just being emotional. If the payoffs were 100 times bigger, it might make sense to go for the safer choice, but for just $3000 or $4000, people of average means are supposed to take the chance. Missing out on the $3000 isn't a big enough downside to outweigh a high probability of getting an extra $1000.

    What's interesting is that you can get a different answer from people if the question is framed differently. Suppose that 4000 people have a disease that is killing them. One treatment is guaranteed to kill 1000 of them and save the rest. The other treatment has an 80% chance of saving all of them and a 20% chance of killing all of them. Which treatment do you choose? The numbers are the same as the other game, but more people are willing to take the gamble here to try to save all 4000 people.

  5. @Michael: you know my opinion on this. The math doesn't say you're being emotional unless you're using the wrong math. The concept of expected value is based on the law of large numbers, and so it is inapplicable to a choice you're only making once.

  6. Screw the math!! :)

    I understand that someone with a lot of money should pick #2, but another way to look at it is that the difference in the 2 positive outcomes is only $1,000, which isn't much for someone who is doing well financially.

    In my mind, $4,000 isn't that much more than $3,000 when I consider the possibility that I might not get anything if I go for the $4,000 prize.

    @Patrick - I had forgotten about that post. Looks like we agree.

  7. @Patrick and @Mike: There is nothing wrong with using an expected value for a choice you only get to make once if you get the utility of the outcomes right. Using expected compound returns is one attempt to assign utility values. My assertion is that a rational assessment of utility for most people would indicate that they should take the chance, even though their gut tells them they shouldn't.

    I'd be interested to know how you would react to the reframing of the question from my earlier comment where you're saving lives rather than making money. This one feels very different to me even though the math is similar.

  8. I would make the same choice for your other question. I'd rather save 3,000 people for sure.

    That's a decision I hope I never have to make.

  9. What is being missed in all these arguments is the marginal utility of money. Because we chase money for the utility it provides, and money has diminishing marginal utility, a dollar gained cannot be worth the same as a dollar lost, in terms of utility.

    Mathematical expectations theory misses this point if it doesn't account for utility.

    Say someone offers you a 99% chance of making a million dollars and a 1% chance of losing a million dollars. Do you take the bet? It depends. If your net worth is a million dollars, I would think you would not, since the downside, however unlikely, wipes you out completely. Now suppose your net worth is ten million dollars; you would take the bet because the worst case is that you end up with nine million dollars, which wouldn't affect your lifestyle at all.

    In both cases, expectations are the same but marginal utility is different.
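
    To put rough numbers on this intuition, here is a small sketch that uses log of final wealth as the utility function (the same diminishing-utility idea discussed above; a wiped-out outcome is treated as infinitely bad, and the function name is just for illustration):

        import math

        def expected_log_wealth(net_worth, bet):
            # bet is a list of (probability, gain) pairs; ending up at zero
            # (total ruin) gets a utility of minus infinity
            total = 0.0
            for p, gain in bet:
                final = net_worth + gain
                total += p * (math.log(final) if final > 0 else float("-inf"))
            return total

        bet = [(0.99, 1000000), (0.01, -1000000)]
        for w in (1000000, 10000000):
            take = expected_log_wealth(w, bet)
            decline = math.log(w)   # utility of refusing the bet
            print(w, "take the bet" if take > decline else "decline")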

  10. There is an annoying complication if you face this choice in reality:

    Can you TRUST the people handling the money?

    Would they truly give you $4,000 80% of the time, or would it be more like 50% of the time?

    Reality is different from the theoretical discussion.

  11. @Ahmed: It's true that we have to take into account the utility of money. Using log of net worth is an attempt to model utility.

    @Mark: Whether you believe the game is as advertised is certainly an important factor. I've been assuming here that the details of the game are such that the 80% chance is believable.

  12. @CC: It would be interesting to find out how much lower than 20% we can take the probability of getting nothing and still have a strong majority of people prefer the sure $3000.

    Perhaps this preference for avoiding large losses, no matter what the probabilities are, explains lottery tickets (at least in part). With the loss capped at a few dollars, people seem willing to go for an infinitesimal shot at riches.

    Replies
    1. The comment above is a reply to Canadian Capitalist's comment:

      I'll choose #1 for two reasons: I'm offered a single pick, not a lot of picks, and I see #2 as a choice that can gain me $1,000 or lose me $3,000. In other words, #2 is a choice between a little extra cheer and a lot of pain. Like most people, I prefer avoiding pain if possible.

  13. @Michael: You can assert all you want that expected values are relevant to one-time-only choices, but that doesn't make it true. We may just have to agree to disagree on this one.

  14. @Patrick: Let me turn this one around a little. When confronted with one-time choices, we do make selections. These selections are based on something. This something has to do with the possible outcomes of each possible choice and the value we assign to the outcomes. All this can be recast as an expected value computation using some utility function. So, if you can describe how you make your choice, I can show how your decision making can be modeled with a utility function and an expected value.

    Very often when people talk of expected values, the implication is that they are using a linear utility function. This approach values choice #2 at $3200. However, this calculation ignores the fact that each additional dollar I accumulate is less valuable to me than the last. But this doesn't mean that the expected value has failed; it is the utility function that has failed.

  15. @Michael: I hadn't thought of that. So I'll grant that any (computable) decision criterion can be recast as a utility function. But that's a long way from concluding that the certain $3000 is irrational.

  16. Could you please explain how you use the log of net worth in this case?

  17. @Patrick and @Anonymous:

    Here's an example of how to use log of net worth for someone whose net worth is $50k:

    Choice 1: net worth increases to $53k

    Choice 2: net worth either goes up to $54k (80%) or stays at $50k (20%).

    Using natural logarithms, the computation is

    e^(0.8*ln(54000)+0.2*ln(50000))
    = $53175

    So, based on log-utility, choice 2 is worth $175 more than choice 1 for someone starting with $50k.
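
    The same number falls out of a couple of lines of Python, if you'd rather check it that way (this just re-runs the formula above):

        import math

        w = 50000
        choice1 = w + 3000                                                 # $53,000 for certain
        choice2 = math.exp(0.8 * math.log(w + 4000) + 0.2 * math.log(w))   # about $53,175
        print(choice1, round(choice2), round(choice2 - choice1))           # difference is about $175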

    This is my basis for saying that taking choice 1 is irrational for most people. If you don't think this result is reasonable, then I'm interested in what utility function you think makes more sense.

    Replies
    1. I happened to reread this post and I think I have another useful insight. The math says that the riskier choice is worth something close to $175 more than the safe choice, but this ignores emotions.

      If I were the type of person who just forgets about things I can't change, then the second choice is rational. However, if I were to brood over taking the risky choice and losing, the extra $175 might not be enough to compensate me for the possibility of feeling bad for a long time.

      If I were the type to stay depressed over a perceived loss for quite a while, I might actually value the bad outcome of the risky choice as minus $1000 rather than just zero. This would be enough to make me prefer the safe choice.
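
      As a rough check of that claim, using the same $50k starting point as in my earlier comment (the minus $1000 emotional penalty on the bad outcome is the assumption I just described):

          import math

          w = 50000
          safe = w + 3000                                                     # $53,000
          # the bad outcome is valued at -$1000 instead of zero because of brooding
          risky = math.exp(0.8 * math.log(w + 4000) + 0.2 * math.log(w - 1000))
          print(safe, round(risky))   # risky comes out near $52,961, below the safe $53,000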

      I doubt I could brood that much, so I'd go for the risky choice, but I suppose others might.

      However, I doubt that people go through this calculation when they pick the safe choice. I think it's far more likely that avoiding risk is baked into our lizard brains.
