Flaws in Thinking
Every cognitive bias is there for a reason, primarily to save our brains time or energy. If you look at them by the problem they're trying to solve, it becomes a lot easier to understand why they exist, how they're useful, and the trade-offs (resulting mental errors) that they introduce.
Cognitive biases address these issues...
- Information overload
- Lack of meaning
- The need to act fast
- What needs to be remembered for later
1. Information overload
The brain filters out information that it thinks is not important. There is too much information available; it's not practical to process all of it.
We notice only things that are primed or repeated
- Availability Heuristic
- Attentional bias: What we believe/want influences what we focus on. Eg. people addicted to smoking are more attentive to smoking cues.
- Illusory truth effect: We believe incorrect information to be correct after repeated exposure. Eg. Advertising, Propaganda
- Mere-Exposure Effect
- Context effect/Cue-dependent forgetting: We cannot recall information without memory aids/cues. Cognition and memory are dependent on context.
- Frequency illusion/Baader-Meinhof Phenomenon: Once you learn a new word/concept, you see it everywhere
- Hot-Cold Empathy Gap
- Recency bias: More weight is given to a recent event/data/evidence than to past events/data/evidence
Bizarre/funny/visually-striking/anthropomorphic things are more noticeable
tags: cbias-notice-specific
We generally skip information that's ordinary or expected.
- Bizarreness effect/Humor effect: We remember bizarre material better.
- Von Restorff effect/Isolation Effect: If there are multiple similar stimuli, we remember the one that differs from the rest.
- Negativity bias: Negative stimuli have a bigger impact on our mental state than comparable positive stimuli.
- Publication bias: Bias that favors positive results in published academic research.
- Omission bias: We favor an act of omission/inaction over commission/action.
Change is Noticed Prominently
tags: cbias-notice-change
- Anchoring Effect: Choices are affected by an anchor. Eg. people are more likely to purchase something if it is placed with a more expensive option (the anchor).
- Framing effect: People's choices can change based on how the question is framed. Eg. people are more likely to choose something described as 80% fat-free than as 20% fat.
- Weber–Fechner law: Perceived difference is not the same as actual difference. We think 10 to 20 is a bigger change than 1000 to 1010 (see the sketch after this list).
- Distinction bias: When evaluating two options side by side, we view them as very different; evaluated separately, we would have judged them to be very close to each other.
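A minimal Python sketch of the Weber–Fechner idea above, assuming a logarithmic perception model (the log form and the scaling constant k are illustrative assumptions, not part of these notes):

```python
import math

def perceived_change(old, new, k=1.0):
    """Toy Weber-Fechner model: perceived intensity grows with the logarithm
    of the stimulus, so perceived change depends on the ratio of the values,
    not on the absolute difference. k is an arbitrary scaling constant."""
    return k * math.log(new / old)

# Same absolute change (+10), very different perceived change.
print(perceived_change(10, 20))      # ~0.693  (feels like a big jump)
print(perceived_change(1000, 1010))  # ~0.00995 (barely noticeable)
```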
Drawn to details that confirm existing beliefs
tags: cbias-notice-confirmation
And ignore/miss details that contradict our beliefs
- Confirmation Bias
- Congruence bias: People over-rely on their initial hypothesis. Can be countered by deliberately trying to falsify the initial hypothesis.
- Choice-supportive bias/Post-purchase rationalization: People justify a past decision by subconsciously giving it positive attributes.
- Selective Perception: We fail to notice, or quickly forget, things that cause us emotional discomfort.
- Observer-expectancy effect/Observer effect: Experimenters interpret results incorrectly because they have a pre-existing hypothesis.
- Ostrich effect: Avoiding negative information that can cause discomfort. Eg. not checking messages because you are expecting bad news.
- Subjective validation: Believing a claim because it has personal meaning/significance to us.
- Continued influence effect: We continue to believe wrong information even after learning that it is wrong.
- Semmelweis reflex: A reflex-like tendency to reject ideas that oppose what we already believe
- Bucket error: Lumping two separate questions into one mental bucket, so a conclusion about one feels like it decides the other. Eg. "I misspelled a word, so I cannot be a writer."
- Law of narrative gravity: The public and the press like narratives. The more widely accepted a narrative is, the more it shapes the perception of facts.
- Motivated Reasoning
We notice flaws in others more easily than in ourselves.
tags: cbias-notice-others-flaws
- Fundamental Attribution Error
- Bias blind spot: It is more difficult to notice biases in ourselves than in others
- Naïve realism: Belief that our view of the world is objective, and people who disagree are irrational/biased/misinformed.
- Naive Cynicism: Expecting others to be more biased than they actually are
2. Not enough meaning
tags: cbias-need-meaning
The world is too complex to understand fully. So we compensate by filling in the gaps in our understanding to make better sense of it, or at least to believe that we understand it. We assign meaning to the world; we do our own sensemaking.
We find patterns and meaning even with little data
tags: cbias-meaning-from-little-data
Our brain needs to feel that it has a coherent model/story about the situation - even if we have too little information about it.
- Confabulation: Memory error - people sometimes have wrong/distorted memories that they are confident about.
- Clustering illusion: The belief that streaks or clusters in random data are non-random.
- Insensitivity to sample size: People tend to ignore the sample size of data, forgetting that extreme variation is much more likely in smaller samples (see the sketch after this list).
- Neglect of probability: Tendency to ignore probability when making decisions under uncertainty. Small risks are either ignored entirely or greatly exaggerated. Eg. the risk of death due to a nuclear power plant failure.
- Anecdotal fallacy: Belief that personal experience/anecdotal examples trump data or logical argument.
- Illusion of validity: Overestimating our ability to interpret data and predict outcomes when the data shows a consistent pattern. If we feel it tells a coherent story, we are very confident of our prediction.
- Gambler’s fallacy: The belief that if a statistically independent event has occurred several times, it is less likely to occur again. Eg. a die that has rolled 6 several times in a row is thought to be "due" for a non-6.
- Hot-hand fallacy: Belief that someone who has been successful will be more likely to be successful in future attempts.
- Illusory correlation: Perceiving a relation between things(people, behaviors, events, etc) when no such relation exists.
- Pareidolia: Seeing a pattern/meaning where there is none. Eg. seeing a face in a cloud.
- Anthropomorphism: Assigning human traits, attributes, emotions or agency to non-human things.
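A minimal Python sketch of insensitivity to sample size, loosely modeled on the classic small-hospital/large-hospital problem; the daily counts (15 vs 45), the 60% threshold, and the function name are illustrative assumptions:

```python
import random

def extreme_day_rate(events_per_day, days=10_000, threshold=0.6):
    """Fraction of simulated 'days' on which the share of 'heads' outcomes is
    at least `threshold`. Extreme outcomes show up far more often when the
    daily sample is small, even though the underlying rate is always 50%."""
    extreme = 0
    for _ in range(days):
        heads = sum(random.random() < 0.5 for _ in range(events_per_day))
        if heads / events_per_day >= threshold:
            extreme += 1
    return extreme / days

# The small sample crosses the 60% mark far more often than the large one.
print(extreme_day_rate(15))   # roughly 0.30
print(extreme_day_rate(45))   # roughly 0.12
```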
We jump to conclusions using stereotypes, generalities, past occurrences.
tags: cbias-meaning-from-stereotypes
When we have only limited information or a gap in knowledge, we tend to fill in the gaps using best guesses from stereotypes and generalities. Once that is done, we cannot easily determine which part is real and which part was filled in.
- Group attribution error: Belief that the characteristics of one person in a group must be shared by all its members. Also, the belief that a group's decision reflects the preferences of every individual in it.
- Ultimate attribution error: Attributing the negative behaviors of the outgroup (them) to personality defects, and the negative behaviors of the ingroup (us) to external circumstances or chance. The Fundamental Attribution Error at the group level.
- Stereotyping: A general belief about a group of people, expected to be true of every individual in the group.
- Essentialism: Philosophical view that all things have a set of properties that are necessary to their identity. Comes from Plato - all things have an "essence".
- Functional fixedness: Cognitive bias that limits your imagination of how an object can be used to only its traditional use. Eg. a hammer can be used as a paper weight - but if you have functional fixedness, you might not be able to see it.
- Moral credential effect/Self-licensing: A previous "good" behavior will make it easier to do "bad" behavior. Eg. I just exercised, so I can have a pizza.
- Just-world hypothesis: Belief that people will get what they deserve, or that everything happens for a reason.
- Argument from fallacy: The idea that because an argument contains a logical fallacy, its conclusion must be false. Eg. even if an argument includes a slippery slope fallacy, its conclusion might still be true.
- Authority bias: Belief that the views of an authority figure (eg. God, government, a parent) are more accurate, and letting them influence our opinions.
- Automation bias: Tendency to believe that decisions from an automated decision-making system are more accurate, and even to ignore contradictory information from non-automated sources.
- Bandwagon effect: Tendency to follow the crowd. Adopting behaviors, practices, attitudes, beliefs only because others are doing it.
- Placebo effect: An inert pill can cure health issues if the patient believes that it will.
Belief that liked or known things are better
tags: cbias-liked-or-known
Belief that people/things we like or are familiar with are better than those we don't like or aren't familiar with.
- Halo effect: Tendency to transfer the good impression of a person/company/brand/etc in one context to things they recommend in another context. Eg. If you like Ronaldo as a footballer, you might like the shaving cream that he advertises.
- In-group bias: Favoring people of your in-group over people outside.
- Out-group homogeneity bias: The belief that the people in the out-group are very similar to each other. "They are all alike; we are diverse."
- Cross-race effect: Tendency to recognize faces from your own race more easily when compared to recognizing faces from other races.
- Cheerleader effect/group attractiveness effect: Tendency to believe that individuals are more attractive when they are in a group.
- Well-traveled road effect: Tendency to feel that a journey to an unfamiliar place took longer, while a journey of equal duration on a very familiar route feels shorter.
- Not invented here: Tendency to avoid things with an external origin.
- Reactive devaluation: Tendency to devalue ideas/proposals that come from an "enemy". Eg. in the US, more people favored an arms deal when told it was proposed by the US than when told it came from the Russian side.
- Positivity effect: Tendency to seek the positive.
Simplification of Probability and Numbers
tags: cbias-math-simplification
The subconscious mind is bad at math and uses simplifications to speed up decision-making. This can produce wrong results.
- Mental accounting: People tend to assign subjective value to money, and this is susceptible to biases and thinking flaws. Eg. if you find $100 on the ground, you treat yourself to an expensive dinner; if you had worked for that money, you would have saved it.
- Normalcy bias: The brain sometimes ignores repeated warning signals, leading people to disbelieve or minimize threat warnings. Eg. people ignoring flood warnings.
- Appeal to probability fallacy: Belief that if something is possible, then it is probable. Eg. "If I don't bring my umbrella, it will rain for sure."
- Base rate fallacy: When provided with general base rate information and specific local information, people tend to overvalue the specific information rather than integrating the two. Eg. consider this problem: Steve is shy; is Steve a librarian or a salesperson? Most people assume Steve is a librarian, but there are far more salespeople than librarians, so it is more likely that he is a salesperson (see the sketch after this list).
- Hofstadter’s law: It always takes longer than you expect, even when you take into account Hofstadter's Law.
- Subadditivity effect: Tendency to judge the probability of the whole to be less than the sum of the probabilities of its parts.
- Survivorship bias: Concentrating on the people/things that got past a selection point. Eg. modeling your life after successful people.
- Zero sum bias: Treating a situation as zero-sum and competing even when resources are not limited. Eg. studying in a class: there is no need to compete with fellow students.
- Denomination effect: We are less likely to spend a large-denomination note than the equivalent value in smaller notes.
- Magic number 7±2: The number of items that can be held in short-term memory is about 7 ± 2.
- Swimmer’s body illusion: People confuse selection factors with results. Eg. people in the entertainment industry did not become attractive through effort; attractiveness is a selection criterion for the industry.
- Money illusion: People mistake the face value of money (the nominal amount) for its real value (what it can buy).
- Conservatism: People don't easily change an existing belief even when presented with new evidence. They overweight the existing belief and underweight the new evidence.
- Time-saving bias: We underestimate the time saved when increasing from a relatively low speed (eg. 30 km/h to 40 km/h) and overestimate the time saved when increasing from an already high speed (eg. 70 km/h to 100 km/h).
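A minimal Bayes' rule sketch of the base rate fallacy example above; the 20:1 salesperson-to-librarian ratio and the 70%/15% shyness rates are made-up numbers for illustration, not real statistics:

```python
# Prior: assume 20 salespeople for every librarian (illustrative ratio).
p_librarian = 1 / 21
p_sales = 20 / 21

# Likelihoods: how often each group matches the "shy" description
# (made-up numbers for illustration).
p_shy_given_librarian = 0.70
p_shy_given_sales = 0.15

# Bayes' rule: P(librarian | shy) = P(shy | librarian) * P(librarian) / P(shy)
p_shy = p_shy_given_librarian * p_librarian + p_shy_given_sales * p_sales
p_librarian_given_shy = p_shy_given_librarian * p_librarian / p_shy

print(round(p_librarian_given_shy, 2))  # ~0.19: Steve is still probably a salesperson
```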
We think we know what others think
tags: cbias-guess-thoughts
We model the thinking of other people based on our own mind or a much simpler mind than ours.
- Curse of knowledge: When communicating with others, we assume that they already have all the background information about the topic that we have. Eg. when teaching, the teacher can't understand why something so simple for them is so difficult for the student.
- Illusion of transparency: We think other people can understand our mental state fairly accurately, and we think our guesses about another person's mental state are very accurate.
- Spotlight effect: We think we are noticed/thought about by others way more than we actually are.
- Illusion of external agency: A belief that good/bad things happen to us because of external influences rather than personal actions. Eg. I got good marks because the teacher likes me.
- Illusion of asymmetric insight: Belief that we know other people better than they know us.
- Extrinsic incentive error: We think other people are driven more by extrinsic motivators while we are driven more by intrinsic motivators.
Current mental state is projected onto the past and future
tags: cbias-project-mind-state
- Hindsight Bias
- Outcome bias: Evaluating the quality of a decision by its outcome rather than by the information available when it was made.
- Moral luck: Assigning praise or blame for an action based on its outcome, even when the outcome was not fully in the person's control. Eg. two people drove drunk and one killed a child; we assign that one more blame even though both did the same thing and the outcome was not fully within their control.
- Declinism: Belief that a society or institution is becoming worse over time; that the past was better than the future will be.
- Telescoping effect: Events seem more distant (backward telescoping/time expansion) or more recent (forward telescoping) than they actually are. Events more than about three years old tend to be forward telescoped, and more recent events tend to be backward telescoped.
- Rosy retrospection: We judge past events more positively than we do the present. Possibly caused by nostalgia.
- Impact bias: We overestimate the duration and intensity of future emotional states. Eg. "If we break up, it will kill me."
- Pessimism bias: We exaggerate the probability of bad things happening to us. Opposite of Optimism Bias.
- Planning fallacy: We underestimate the time required to finish a task. A result of Optimism Bias. This mainly affects predictions about our own tasks; when predicting for others, we tend toward a pessimism bias.
- Pro-innovation bias: If we see an innovation work, we tend to believe that it can be applied everywhere without alteration. This is especially true if the people who made the innovation are still around and championing the idea.
- Projection bias: We project our current preferences onto future events. Caused by mental contamination. Eg. a student who gets bad marks and is sad thinks they will not enjoy a party two weeks in the future.
- Restraint bias: We overestimate our ability to control impulsive behavior. This can lead to more exposure to temptations and impulsivity. Caused by Hot-Cold Empathy Gap.
- Self-consistency bias: We believe that we are consistent in our beliefs, behaviors, opinions, attitudes, etc. This is a memory bias.
3. We have to act fast
tags: cbias-act-fast
We evolved with the need to make quick decisions when faced with limited time and information. This programming continues in the present in the form of these thinking flaws.
To act, we need to feel important and impactful
tags: cbias-act-fast-important
- Overconfidence effect
- Egocentric bias
- Optimism bias
- Social desirability bias
- Third person effect
- Barnum effect
- Illusion of control
- False consensus effect
- Dunning-Kruger effect
- Hard easy effect
- Illusory superiority
- Self serving bias
- Fundamental attribution error
- Defensive attribution hypothesis
- Trait ascription bias
- Effort justification
- Risk compensation
Favor immediate, known things over distant ones
tags: cbias-favor-immediate
We favor the present over the future, and stories about specific individuals over anonymous people.
We want to finish things we have invested in
tags: cbias-finish
Helps us finish things, even when it is difficult. Actions have inertia: once started, it is easier to continue.
- Sunk cost fallacy
- Irrational escalation
- Loss aversion
- IKEA effect
- Generation effect
- Zero-risk bias
- Disposition effect
- Unit bias
- Pseudocertainty effect
- Endowment effect
- Backfire effect
We want to have autonomy and status
tags: cbias-autonomy
- System justification
- Reactance
- Reverse psychology
- Social comparison bias
- Status quo bias
- Abilene paradox
- Law of the instrument
- Chesterton's fence
- Hippo Effect
We want to avoid irreversible decisions
tags: cbias-avoid-irreversable
We prefer simple or complete options over complex, ambiguous options
tags: cbias-prefer-simple
- Ambiguity bias
- Information bias
- Belief bias
- Rhyme as reason effect
- Bike shed effect
- Delmore effect
- Conjunction fallacy
- Occam's razor
- Less-is-better effect
- Sapir-Whorf-Korzybski hypothesis
4. What to remember
tags: cbias-memory
We have to prioritize what to remember and what to discard. We have a set of filters that help us do this, but they can cause issues too.
We edit memories after the event
tags: cbias-memory-edit
We discard specifics to create generalizations
tags: cbias-memory-general
We reduce events and lists to their key elements
tags: cbias-reduce-to-key
We choose a few items to represent the whole.
- Peak–end rule
- Leveling and sharpening
- Misinformation effect
- Duration neglect
- Serial recall effect
- Modality effect
- Memory inhibition
- Serial position effect
- Suffix effect
We store memories differently based on how the experience felt
tags: cbias-memory-experience
Our brain saves things that it thinks are important. Importance is judged based on the situation, not just the value of the information. Eg. traumatic memories can be very strong.
- Picture superiority effect
- Levels of processing effect
- Testing effect
- Absent-mindedness
- Next-in-line effect
- Tip of the tongue phenomenon
- Google effect
- Self-relevance effect