How to Make Better Decisions
On Paradoxes, Cognitive Biases, and Irrationality
"The purely rational economic man is indeed close to being a social moron." — Amartya Sen
Why should you care?
If you work for a corporation or startup and care about being impactful in your decision making, this article is for you!
Being a knowledge worker means your decisions are meant to drive outcomes, and understanding the flaws in our brains can help you make better ones.
More generally, I think everyone should care about the defects of our cognition, as they impact our personal lives in a surprising number of ways. So hopefully you'll enjoy this read and be able to make better choices for you and your loved ones!
I find human behavior and decision making wildly fascinating...and mostly comical. This is because of the variety of paradoxes, cognitive biases, and irrationalities that are constantly at play in our micro and macro conclusions.
In fact, these illogical behaviors are largely what led me into the career I have today. It all started with economics and statistics (my first graduate degree and love), and I've spent over a decade quantifying and predicting human behavior. The tools I use now may be fancier than when I started but, at my core, I am still fascinated by human behavior, data, and statistical inference: the methods that provide a glimpse of understanding about a person's choices.
What I've found is that humans are just kind of silly—in the sense that what we want to do and what we actually do are often inconsistent.
This is well studied in different academic disciplines (particularly behavioral economics and psychology), but it isn't common knowledge among people outside of that circle. So I thought I'd write about some of my favorite paradoxes and biases and why they matter.
It's worth stating that even if you are aware of these cognitive flaws, you are still quite likely to make the mistakes they call out—I do all of the time.
I want to emphasize that my own decision making is nothing to brag about, but I decided to write about this because I thought it'd be fun and I love this topic.
Lastly, I have written these in the order I've found most useful in my personal and professional life; this ordering, too, is riddled with bias.
01. The Dunning-Kruger Effect
The Dunning-Kruger Effect is one of the most important cognitive biases that exist, probably because of how impactful it is to everyone (i.e., we all suffer from the burden of incompetent people and it is quite likely we, too, are someone's burden).
In short, people with low ability at a given task tend to overestimate their ability and those with high ability tend to underestimate it.
If you pay attention, you will see this occur often...especially in those without much experience in a given area of expertise (though this is not always true).
You may see this often from some people at big companies...run. 😅
02. The Double Standard
This is probably my favorite bias because people (like me, for example) commit this often and it's such a subtle yet common thing. Definitionally, a Double Standard is "the application of different sets of principles for situations that are, in principle, the same."
This is very important professionally as people often have unrealistic expectations of others that they wouldn't have of themselves. I’ve found this comes up when those lacking domain expertise are frustrated by timelines for building pretty much anything. You’ll often hear people gripe, "Why does this take so long?"
As a manager, I regularly ask myself "If I were doing this, would I expect the same outcome within the same time?" and I find that this helps me better empathize and be more realistic about the outcome and timeline.
More importantly, my colleagues probably find me more tolerable. :)
03. The Curse of Knowledge
I say this non-ironically: something is obvious once you know it and not before. That is essentially the Curse of Knowledge: once we know something, we tend to assume others know it too.
This is something I've experienced a lot in my career because people often forget all of the context they have when referencing something. Business is very jargony, so when I onboard people or explain things I very explicitly try to avoid acronyms and cryptic language. It certainly takes mental effort, but it makes working with me much less frustrating for the other party.
Also, I find that assuming someone knows something or being surprised that someone doesn't know something can sound extremely condescending, so it’s probably best to avoid that.
04. Simpson's Paradox
I could write an entire post about Simpson's Paradox but, to keep it brief, Simpson's Paradox is a statistical phenomenon in which a correlation between two variables can be reversed by the addition of another.
But how??? Imagine a scatter plot where variables X and Y are negatively correlated, quite strongly too, with a correlation coefficient of -0.74. Now suppose there is some other grouping variable Z representing 5 groups: split the points by Z, and within every single group X and Y are positively correlated.
Oh no! The exact opposite conclusion! It's worth noting that in both statistics and life, you may never know of the existence of Z (i.e., some other variable influencing the direction of your conclusions).
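Here's a quick sketch of the effect with made-up numbers (not the data behind the original figures; the group count, slopes, and noise levels are my own assumptions): five groups where Y rises with X inside each group, yet the pooled trend points the other way.

```python
import numpy as np

rng = np.random.default_rng(42)

xs, ys = [], []
for group in range(5):
    # Each group's X center shifts right while its Y level drops,
    # so the pooled cloud slopes downward...
    x = rng.normal(loc=2.0 * group, scale=0.5, size=200)
    # ...but within a group, Y still rises with X (slope +0.8).
    y = 0.8 * x - 4.0 * group + rng.normal(scale=0.5, size=200)
    xs.append(x)
    ys.append(y)

pooled_r = np.corrcoef(np.concatenate(xs), np.concatenate(ys))[0, 1]
within_rs = [np.corrcoef(x, y)[0, 1] for x, y in zip(xs, ys)]

print(f"pooled correlation: {pooled_r:.2f}")               # negative
print("within-group:", [round(r, 2) for r in within_rs])   # all positive
```

Same data, two opposite stories, depending entirely on whether you condition on Z.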
So while this is useful for regression and statistical inference, I find this paradox to be applicable to many more situations.
Said another way, I may always be missing a single, critical piece of information that may flip my conclusion. So I tend to calibrate my opinions accordingly.
05. The Sunk Cost Fallacy
As elegantly written by The Decision Lab, "The Sunk Cost Fallacy describes our tendency to follow through on an endeavor if we have already invested time, effort, or money into it, whether or not the current costs outweigh the benefits."
Emotion often clutters our ability to understand the actual expected value/reward of a given thing we are putting effort into but sometimes it is in our best interest to cut our losses rather than see it through.
It rarely feels good but can often be the optimal decision.
06. Loss Aversion
Loss Aversion is the disproportionate weight a person places on avoiding losses relative to acquiring equivalent gains: losing $10 hurts more than gaining $10 feels good.
For example, someone may prefer to take $10 with 100% certainty rather than $20 with 90% certainty, because the displeasure of that 10% chance of walking away with nothing outweighs the extra $8 of expected value (0.9 × $20 − $10 = $8).
This is extremely irrational and puts undisciplined investors at a mathematical disadvantage.
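The arithmetic from the example above, spelled out:

```python
# Expected value of each offer from the example above.
certain = 1.00 * 10   # $10 guaranteed
gamble = 0.90 * 20    # $20 with probability 0.9, else $0

print(certain)            # 10.0
print(gamble)             # 18.0
print(gamble - certain)   # 8.0: the gamble is worth $8 more in expectation
```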
07. The Gambler's Fallacy
The Gambler's Fallacy refers to the incorrect belief that a given event is more or less likely based on a previous sequence of events, when the events are actually independent.
This can be seen through coin flips. If you see 5 heads flipped in a row, you may think that a tails is "due," but this is incorrect (assuming a fair coin) since coin flips are independent (i.e., no flip depends on any other).
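You can check this with a simulation (the trial count here is arbitrary): generate many six-flip sequences, keep only the ones that start with five heads, and look at flip six.

```python
import random

random.seed(0)

next_flips = []
for _ in range(200_000):
    flips = [random.random() < 0.5 for _ in range(6)]  # True = heads
    if all(flips[:5]):             # the first five flips were all heads
        next_flips.append(flips[5])

p = sum(next_flips) / len(next_flips)
print(f"P(heads | 5 heads in a row) ~ {p:.3f}")   # hovers around 0.5
```

No tails is ever "due": the streak carries zero information about the next flip.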
08. Anchoring
Pulling again from The Decision Lab, "Anchoring is a cognitive bias that causes us to rely too heavily on the first piece of information we are given about a topic. When we are setting plans or making estimates about something, we interpret newer information from the reference point of our anchor, instead of seeing it objectively."
This is often used in marketing and pricing to delude you into thinking something is on sale. :')
09. Sample Bias
Sample Bias originates from statistics and results from flawed collection of what was intended to be a random sample.
This is particularly common in business and on Twitter, where people assume their customers or Twitter poll respondents are representative of the entire population.
They're not, and this can lead to very poor decisions and conclusions.
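A toy illustration (the population, the scores, and the "only the happiest respond" mechanism are all made up): compare a random sample against a sample that only captures one end of the population.

```python
import random

random.seed(1)

# Hypothetical population: 100,000 satisfaction scores (mean ~5).
population = [random.gauss(5.0, 2.0) for _ in range(100_000)]

# A proper random sample tracks the true mean.
random_sample = random.sample(population, 1_000)

# A biased "sample": only the 1,000 happiest users answered the poll.
biased_sample = sorted(population)[-1_000:]

def mean(xs):
    return sum(xs) / len(xs)

print(f"true mean:          {mean(population):.2f}")
print(f"random-sample mean: {mean(random_sample):.2f}")
print(f"biased-sample mean: {mean(biased_sample):.2f}")   # way too high
```

The biased estimate isn't a little off; it describes a different population entirely.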
10. Assignment Bias
Assignment Bias is similar to Sample Bias in that the sample is biased, but it is rooted in a broken assignment system. For example, imagine an experimental drug trial where the "random assignment machine" (i.e., a machine that assigns things at random) only treated the young and healthy. That is obviously biased, and while it's a pathological, extreme example, it highlights the issue.
By the way, it turns out that a good "random" sample is extremely hard to collect in the real world—ask the Census.
11. Self-Selection Bias
Self-Selection Bias is another form of sample bias but it's caused by the participants choosing whether or not to participate in the experiment, treatment, survey, or what have you.
In the example above, imagine instead that the "random assignment machine" only treated the people who wanted to be treated and not the ones who didn't; that would obviously bias the measured effects of the study. So, as in the previous case, it would ruin the experiment.
12. Decision Fatigue
Decision Fatigue is a phenomenon whereby an individual's decision making quality deteriorates after a long session of decision making.
In short, you get tired of making choices and you start to get sloppy. In the business and investing world, this is extremely consequential because your or your investor's money is on the line.
13. Optimism Bias
To quote Wikipedia, "Optimism Bias is a cognitive bias that causes someone to believe that they themselves are less likely to experience a negative event. It is also known as unrealistic optimism or comparative optimism."
It is good to be optimistic, but it is good-er to balance it with reality.
14. Response Bias
Response Bias is both interesting and counterintuitive.
It is a catch-all for the frequent tendency of participants to respond inaccurately or falsely to survey questions. This is part of the reason surveys often conflict with reality.
As a statistician, I feel surveys are kind of useful but behaviors reveal the truth. Measure behaviors.
For internet companies, you may find that user-survey data conflict with tracking metrics that you have for your customer. What people say and what they do are often wildly different.
15. The Accuracy Paradox
Lastly, The Accuracy Paradox is the paradoxical finding that accuracy isn't necessarily a good metric for measuring statistical or machine learning models.
This is because of imbalanced outcomes: one outcome is far more common than the other.
Suppose I were trying to predict whether someone had a rare illness that 10 out of 100,000 people have. If I predicted that no one had it, I'd still have 99.99% accuracy.
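The arithmetic behind that example:

```python
population = 100_000
sick = 10   # 10 in 100,000 have the illness

# The lazy classifier predicts "not sick" for everyone: it gets every
# healthy person right and every sick person wrong.
correct = population - sick
accuracy = correct / population

print(f"accuracy: {accuracy:.2%}")   # 99.99%, despite catching zero cases
```

A metric like recall on the sick class (here: 0 out of 10) tells the real story.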
So, accuracy can sometimes be quite useless as a metric for assessing the quality of things in general (though not always).
Managing the Irrational
You probably can't completely stop yourself from irrational decision making but you can possibly manage it.
My approach is simple: acknowledge the biases above, the idiosyncratic ones I have from my life experiences, and reflect frequently. I find that this causes me to change my mind often.
This can be frustrating but I think that early reactions or understandings are often not the optimal ones so I try to put effort into reflecting so that I can get better outcomes.
When it comes to human judgment, being objective is nearly impossible, and I'm not even exactly sure what "objective" really means outside of mathematics.
But my advice on being as objective as possible is to write things down: bullet points, a simple pros and cons list, or whatever can highlight flaws in your reasoning.
I find this brings me mental clarity more than anything else. I also tend to write things down on paper the good ol' fashioned way¹.
Managing cognitive biases and mitigating their adverse consequences is extremely challenging but a worthy endeavor, as good decisions compound like great investments.
But don't worry too much if you find yourself struggling with it, we are all human after all. 😉
Did you like this post? Do you have any feedback? Do you have some topics you’d like me to write about? Do you have any ideas how I could make this better? I’d love your feedback!
Feel free to respond to this email or reach out to me on Twitter!
¹ Maybe this is an old habit grounded in problem sets, who knows.