# Cognitive Bias Catalog
## Quick Reference Table
| Bias | Category | Impact | Detection | Remediation |
|------|----------|--------|-----------|-------------|
| Confirmation | Confirmation | Seek supporting evidence only | Search for disconfirming evidence? | Red team your forecast |
| Desirability | Confirmation | Want outcome → believe it's likely | Do I want this outcome? | Outsource to neutral party |
| Availability | Availability | Recent/vivid events dominate | What recent news influenced me? | Look up actual statistics |
| Recency | Availability | Overweight recent data | Considering full history? | Expand time window |
| Anchoring | Anchoring | First number sticks | Too close to initial number? | Generate estimate first |
| Affect | Affect | Feelings override data | How do I feel about this? | Acknowledge, then set aside |
| Loss Aversion | Affect | Overweight downside | Weighting losses more? | Evaluate symmetrically |
| Overconfidence | Overconfidence | Intervals too narrow | Track calibration | Widen intervals; default to 20-80% |
| Dunning-Kruger | Overconfidence | Novices overestimate | How experienced am I? | Seek expert feedback |
| Optimism | Overconfidence | "Won't happen to me" | What's the base rate? | Apply base rate to self |
| Pessimism | Overconfidence | Overweight negatives | Only considering downsides? | List positive scenarios |
| Attribution Error | Attribution | Blame person, not situation | What situational factors? | Consider constraints first |
| Self-Serving | Attribution | Success=skill, failure=luck | Consistent attribution? | Same standard for both |
| Framing | Framing | Presentation changes answer | How is this framed? | Rephrase multiple ways |
| Narrative Fallacy | Framing | Simple stories mislead | Story too clean? | Prefer stats over stories |
| Sunk Cost | Temporal | Can't abandon past investment | Only future costs/benefits? | Decide as if starting fresh |
| Hindsight | Temporal | "Knew it all along" | Written record of prediction? | Record forecasts beforehand |
| Planning Fallacy | Temporal | Underestimate time/cost | Reference class timeline? | Add 2-3x buffer |
| Outcome Bias | Temporal | Judge by result not process | Evaluating process or outcome? | Judge by info available then |
| Clustering Illusion | Pattern | See patterns in randomness | Statistically significant? | Test significance |
| Gambler's Fallacy | Pattern | Expect short-term balancing | Are events independent? | Use actual probability |
| Base Rate Neglect | Bayesian | Ignore prior probabilities | Did I start with base rate? | Always start with base rate |
| Conjunction Fallacy | Bayesian | Specific > general | Is A&B > A alone? | P(A&B) ≤ P(A) always |
| Halo Effect | Social | One trait colors everything | Generalizing from one trait? | Assess dimensions separately |
| Authority Bias | Social | Overweight expert opinions | Expert's track record? | Evaluate evidence not credentials |
| Peak-End | Memory | Remember peaks/endings only | Remembering whole sequence? | Review full historical record |
---
## Confirmation Cluster
### Confirmation Bias
**Definition:** Search for, interpret, and recall information that confirms pre-existing beliefs.
**Affects forecasting:** Only look for supporting evidence, discount contradictions, selective memory.
**Detect:** Did I search for disconfirming evidence? Can I steelman the opposite view?
**Remediate:** Red team your forecast, list disconfirming evidence first, ask "How could I be wrong?"
### Desirability Bias
**Definition:** Believing outcomes you want are more likely than they are.
**Affects forecasting:** Bullish on own startup, wishful thinking masquerading as analysis.
**Detect:** Do I want this outcome? Am I emotionally invested?
**Remediate:** Outsource to neutral party, imagine opposite outcome, forecast before declaring preference.
---
## Availability Cluster
### Availability Heuristic
**Definition:** Judging probability by how easily examples come to mind.
**Affects forecasting:** Overestimate vivid risks (terrorism), underestimate mundane (heart disease), media coverage distorts frequency perception.
**Detect:** What recent news am I thinking of? Is this vivid/emotional/recent?
**Remediate:** Look up actual statistics, use reference class not memorable examples.
### Recency Bias
**Definition:** Overweighting recent events relative to historical patterns.
**Affects forecasting:** Extrapolate recent trends linearly, forget cycles and mean reversion.
**Detect:** How much history am I considering? Is forecast just recent trend?
**Remediate:** Expand time window (decades not months), check for cyclicality.
---
## Anchoring Cluster
### Anchoring Bias
**Definition:** Over-relying on first piece of information encountered.
**Affects forecasting:** First number becomes estimate, can't adjust sufficiently from anchor.
**Detect:** What was first number I heard? Am I too close to it?
**Remediate:** Generate own estimate first, use multiple independent sources.
### Priming
**Definition:** Prior stimulus influences subsequent response.
**Affects forecasting:** Reading disaster primes pessimism, context shapes judgment unconsciously.
**Detect:** What did I just read/see/hear? Is mood affecting forecast?
**Remediate:** Clear mind before forecasting, wait between exposure and estimation.
---
## Affect Cluster
### Affect Heuristic
**Definition:** Letting feelings about something determine beliefs about it.
**Affects forecasting:** Like it → think it's safe, dislike it → think it's dangerous.
**Detect:** How do I feel about this? Would I forecast differently if neutral?
**Remediate:** Acknowledge emotion then set aside, focus on base rates and evidence.
### Loss Aversion
**Definition:** Losses hurt more than equivalent gains feel good (roughly a 2:1 ratio).
**Affects forecasting:** Overweight downside scenarios, status quo bias, asymmetric risk evaluation.
**Detect:** Am I weighting losses more? Would I accept bet if gains/losses swapped?
**Remediate:** Evaluate gains and losses symmetrically, use expected value calculation.
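To make "evaluate symmetrically" concrete, here is a minimal Python sketch with hypothetical payoffs: it compares a plain expected-value calculation with a loss-averse valuation that weights losses about 2:1.

```python
# Hypothetical bet: 50% chance to win $100, 50% chance to lose $80.
outcomes = [(0.5, 100.0), (0.5, -80.0)]  # (probability, payoff)

# Symmetric expected value: weight gains and losses equally.
expected_value = sum(p * x for p, x in outcomes)

# Loss-averse valuation: losses weighted ~2x (illustrative coefficient).
LOSS_WEIGHT = 2.0
felt_value = sum(p * (x if x >= 0 else LOSS_WEIGHT * x) for p, x in outcomes)

print(f"Expected value:      {expected_value:+.2f}")   # +10.00 -> take the bet
print(f"Loss-averse feeling: {felt_value:+.2f}")       # -30.00 -> feels like a bad bet
```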
---
## Overconfidence Cluster
### Overconfidence Bias
**Definition:** Confidence exceeds actual accuracy.
**Affects forecasting:** 90% intervals capture truth 50% of time, narrow ranges, extreme probabilities.
**Detect:** Track calibration, are intervals too narrow? Can I be surprised?
**Remediate:** Widen confidence intervals, track calibration, use 20-80% as default.
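One way to "track calibration" is to log each stated 90% interval and check how often the realized value actually lands inside it. A minimal sketch; the forecast records are hypothetical placeholders.

```python
# Each record: (low, high, actual) for a forecast made with a stated 90% interval.
# The numbers are hypothetical placeholders for your own forecast log.
forecasts = [
    (10, 20, 25),     # missed high
    (100, 150, 140),
    (0.2, 0.4, 0.5),  # missed high
    (5, 8, 7),
    (30, 60, 90),     # missed high
]

hits = sum(1 for low, high, actual in forecasts if low <= actual <= high)
coverage = hits / len(forecasts)

print(f"Stated confidence: 90%, observed coverage: {coverage:.0%}")
# If observed coverage is well below 90%, the intervals are too narrow: widen them.
```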
### Dunning-Kruger Effect
**Definition:** Unskilled overestimate competence; experts underestimate.
**Affects forecasting:** Novices predict with false precision, don't know what they don't know.
**Detect:** How experienced am I in this domain? Do experts agree?
**Remediate:** If novice widen intervals, seek expert feedback, learn domain deeply first.
### Optimism Bias
**Definition:** Believing you're less likely than others to experience negatives.
**Affects forecasting:** "My startup is different" (90% fail), "This time is different" (rarely is).
**Detect:** What's base rate for people like me? Am I assuming I'm special?
**Remediate:** Use reference class for yourself, apply base rates, assume average then adjust slightly.
### Pessimism Bias
**Definition:** Overweighting negative outcomes, underweighting positive.
**Affects forecasting:** Disaster predictions rarely materialize, underestimate human adaptability.
**Detect:** Only considering downsides? What positive scenarios missing?
**Remediate:** Explicitly list positive scenarios, consider adaptive responses.
---
## Attribution Cluster
### Fundamental Attribution Error
**Definition:** Overattribute behavior to personality, underattribute to situation.
**Affects forecasting:** "CEO is brilliant" ignores market conditions, predict based on person not circumstances.
**Detect:** What situational factors am I ignoring? How much is luck vs. skill?
**Remediate:** Consider situational constraints first, estimate luck vs. skill proportion.
### Self-Serving Bias
**Definition:** Attribute success to skill, failure to bad luck.
**Affects forecasting:** Can't learn from mistakes (was luck!), overconfident after wins (was skill!).
**Detect:** Would I explain someone else's outcome this way? Do I attribute consistently?
**Remediate:** Apply same standard to wins and losses, assume 50% luck/50% skill, focus on process.
---
## Framing Cluster
### Framing Effect
**Definition:** Same information, different presentation, different decision.
**Affects forecasting:** "90% survival" vs "10% death" changes estimate, format matters.
**Detect:** How is question framed? Do I get same answer both ways?
**Remediate:** Rephrase multiple ways, convert to neutral format, use frequency (100 out of 1000).
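A quick way to neutralize a frame is to restate both versions as frequencies out of the same denominator. A small sketch using the 90%/10% example above; the helper name is an illustrative choice.

```python
def as_frequencies(p_event: float, n: int = 1000) -> str:
    """Restate a probability and its complement as counts out of n."""
    return f"{round(p_event * n)} out of {n} vs. {round((1 - p_event) * n)} out of {n}"

# "90% survival" and "10% death" are the same fact in two frames:
print("Survive:", as_frequencies(0.90))   # 900 out of 1000 vs. 100 out of 1000
print("Die:    ", as_frequencies(0.10))   # 100 out of 1000 vs. 900 out of 1000
```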
### Narrative Fallacy
**Definition:** Constructing simple stories to explain complex reality.
**Affects forecasting:** Post-hoc explanations feel compelling, smooth narratives overpower messy data.
**Detect:** Is story too clean? Can I fit multiple narratives to same data?
**Remediate:** Prefer statistics over stories, generate alternative narratives, use base rates.
---
## Temporal Biases
### Sunk Cost Fallacy
**Definition:** Continuing endeavor because of past investment, not future value.
**Affects forecasting:** "Invested $1M, can't stop now", hold losing positions too long.
**Detect:** If I started today, would I choose this? Considering only future costs/benefits?
**Remediate:** Consider only forward-looking value, treat sunk costs as irrelevant.
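The decision rule "consider only forward-looking value" can be written as a comparison in which past spending never appears; the figures are hypothetical.

```python
# Hypothetical project: $1M already spent (sunk), $400k more needed to finish,
# expected payoff if finished is $300k.
sunk_cost = 1_000_000        # irrelevant to the decision
remaining_cost = 400_000
expected_payoff = 300_000

# Forward-looking value ignores sunk_cost entirely.
forward_value = expected_payoff - remaining_cost

print(f"Forward-looking value: {forward_value:+,}")  # -100,000 -> stop, despite the $1M spent
```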
### Hindsight Bias
**Definition:** After outcome known, "I knew it all along."
**Affects forecasting:** Can't recall prior uncertainty, overestimate predictability, can't learn from surprises.
**Detect:** What did I actually predict beforehand? Written record exists?
**Remediate:** Write forecasts before outcome, record confidence levels, review predictions regularly.
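A lightweight forecast log is enough to defeat "I knew it all along." A minimal sketch; the file name and fields are illustrative choices, not a prescribed format.

```python
import csv
from datetime import date

def record_forecast(question: str, probability: float, path: str = "forecast_log.csv") -> None:
    """Append a forecast to the log *before* the outcome is known; fill in the outcome later."""
    with open(path, "a", newline="") as f:
        csv.writer(f).writerow([date.today().isoformat(), question, probability, ""])

record_forecast("Product ships by Q3?", 0.65)
```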
### Planning Fallacy
**Definition:** Underestimate time, costs, risks; overestimate benefits.
**Affects forecasting:** Projects take 2-3x longer than planned, inside view ignores reference class.
**Detect:** How long did similar projects take? Using inside view only?
**Remediate:** Use reference class forecasting, add 2-3x buffer, consider outside view first.
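A minimal reference-class adjustment multiplies the inside-view estimate by the historical ratio of actual to planned duration. A sketch with hypothetical project data:

```python
# Hypothetical reference class: (planned_weeks, actual_weeks) for similar past projects.
reference_class = [(10, 22), (8, 18), (12, 30), (6, 14)]

overrun_ratio = sum(actual for _, actual in reference_class) / sum(planned for planned, _ in reference_class)

inside_view_estimate = 9  # weeks, from the project plan
outside_view_estimate = inside_view_estimate * overrun_ratio

print(f"Historical overrun ratio: {overrun_ratio:.1f}x")          # ~2.3x
print(f"Outside-view estimate:    {outside_view_estimate:.0f} weeks")
```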
### Outcome Bias
**Definition:** Judging decision quality by result, not by information available at time.
**Affects forecasting:** Good outcome ≠ good decision, can't separate luck from skill.
**Detect:** What did I know when I decided? Evaluating process or outcome?
**Remediate:** Judge decisions by process not results, evaluate with info available then.
---
## Pattern Recognition Biases
### Clustering Illusion
**Definition:** Seeing patterns in random data.
**Affects forecasting:** "Winning streak" in random sequence, stock "trends" that are noise, "hot hand" fallacy.
**Detect:** Is this statistically significant? Could this be random chance?
**Remediate:** Test statistical significance, use appropriate sample size, consider null hypothesis.
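Before trusting a "streak," it helps to see how long a streak pure chance routinely produces. A minimal Monte Carlo sketch:

```python
import random

def longest_run(flips: list[bool]) -> int:
    """Length of the longest run of identical consecutive outcomes."""
    best = current = 1
    for prev, nxt in zip(flips, flips[1:]):
        current = current + 1 if prev == nxt else 1
        best = max(best, current)
    return best

random.seed(0)
trials = 10_000
runs = [longest_run([random.random() < 0.5 for _ in range(100)]) for _ in range(trials)]

# In 100 fair coin flips, streaks of 6+ are routine -- "patterns" arise from noise.
print(f"Average longest streak in 100 fair flips: {sum(runs) / trials:.1f}")
```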
### Gambler's Fallacy
**Definition:** Believing random events "balance out" in short run.
**Affects forecasting:** "Due for a win" after losses, expecting mean reversion too quickly.
**Detect:** Are these events independent? Does past affect future probability?
**Remediate:** Recognize independent events, don't expect short-term balancing.
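A quick simulation makes independence concrete: for independent events, the chance of a win right after a losing streak equals the unconditional chance. The 50% win probability is a hypothetical choice.

```python
import random

random.seed(1)
WIN_PROB = 0.5
flips = [random.random() < WIN_PROB for _ in range(1_000_000)]

# Outcomes observed immediately after 5 consecutive losses.
after_streak = [flips[i] for i in range(5, len(flips)) if not any(flips[i - 5:i])]

print(f"P(win) overall:              {sum(flips) / len(flips):.3f}")
print(f"P(win | 5 losses just seen): {sum(after_streak) / len(after_streak):.3f}")
# Both are ~0.5: independent events do not "balance out" in the short run.
```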
---
## Bayesian Reasoning Failures
### Base Rate Neglect
**Definition:** Ignoring prior probabilities, focusing only on new evidence.
**Affects forecasting:** "Test is 90% accurate" ignores base rate, vivid case study overrides statistics.
**Detect:** What's the base rate? Did I start with prior probability?
**Remediate:** Always start with base rate, update incrementally with evidence.
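The "90% accurate test" example works out as follows; the 1% base rate is a hypothetical number chosen to show how strongly the prior dominates.

```python
# Hypothetical numbers: 1% base rate, test with 90% sensitivity and 90% specificity.
base_rate = 0.01
sensitivity = 0.90          # P(positive | condition)
false_positive_rate = 0.10  # 1 - specificity

p_positive = sensitivity * base_rate + false_positive_rate * (1 - base_rate)
p_condition_given_positive = sensitivity * base_rate / p_positive

print(f"P(condition | positive test) = {p_condition_given_positive:.1%}")
# ~8.3%: far below the intuitive "90%", because the low base rate dominates.
```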
### Conjunction Fallacy
**Definition:** Believing specific scenario is more probable than general one.
**Affects forecasting:** "Librarian who likes poetry" > "Librarian", detailed scenarios feel more likely.
**Detect:** Is A&B more likely than A alone? Confusing plausibility with probability?
**Remediate:** Remember P(A&B) ≤ P(A), strip away narrative details.
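The rule follows directly from the product rule of probability: adding a detail can only multiply the probability by a factor of at most 1.

```latex
P(A \cap B) \;=\; P(A)\,P(B \mid A) \;\le\; P(A),
\qquad \text{since } 0 \le P(B \mid A) \le 1.
```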
---
## Social Biases
### Halo Effect
**Definition:** One positive trait colors perception of everything else.
**Affects forecasting:** Successful CEO → good at everything, one win → forecaster must be skilled.
**Detect:** Am I generalizing from one trait? Are dimensions actually correlated?
**Remediate:** Assess dimensions separately, don't assume correlation, judge each forecast independently.
### Authority Bias
**Definition:** Overweight opinions of authorities, underweight evidence.
**Affects forecasting:** "Expert said so" → must be true, defer to credentials over data.
**Detect:** What's expert's track record? Does evidence support claim?
**Remediate:** Evaluate expert track record, consider evidence not just credentials.
---
## Memory Biases
### Peak-End Rule
**Definition:** Judging experience by peak and end, ignoring duration and average.
**Affects forecasting:** Remember market peak, ignore average returns, distorted recall of sequences.
**Detect:** Am I remembering whole sequence? What was average not just peak/end?
**Remediate:** Review full historical record, calculate averages not memorable moments.
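A quick check is to compare the peak, the ending, and the average of the full series; the yearly returns below are hypothetical.

```python
# Hypothetical yearly returns (%): memory fixates on the +40 peak and the -5 ending.
returns = [3, -2, 12, 40, 7, -8, 5, -5]

peak = max(returns)
end = returns[-1]
average = sum(returns) / len(returns)

print(f"Peak: {peak}%, End: {end}%, Average: {average:.1f}%")
# The full-record average (6.5%) tells a different story than the peak or the ending.
```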
### Rosy Retrospection
**Definition:** Remembering past as better than it was.
**Affects forecasting:** "Things were better in old days", underestimate historical problems.
**Detect:** What do contemporary records show? Am I romanticizing the past?
**Remediate:** Consult historical data not memory, read contemporary accounts.
---
## Application to Forecasting
### Pre-Forecast Checklist
1. What's the base rate? (Base rate neglect)
2. Am I anchored on a number? (Anchoring)
3. Do I want this outcome? (Desirability bias)
4. What recent events am I recalling? (Availability)
5. Am I overconfident? (Overconfidence)
### During Forecast
1. Did I search for disconfirming evidence? (Confirmation)
2. Am I using inside or outside view? (Planning fallacy)
3. Is this pattern real or random? (Clustering illusion)
4. Am I framing this question neutrally? (Framing)
5. What would change my mind? (Motivated reasoning)
### Post-Forecast Review
1. Record what I predicted before (Hindsight bias)
2. Judge decision by process, not outcome (Outcome bias)
3. Attribute success/failure consistently (Self-serving bias)
4. Update calibration tracking (Overconfidence)
5. What did I learn? (Growth mindset)
---
## Bias Remediation Framework
**Five principles:**
1. **Awareness:** Know which biases affect you most
2. **Process:** Use checklists and frameworks
3. **Calibration:** Track accuracy over time
4. **Humility:** Assume you're biased, not immune
5. **Updating:** Learn from mistakes, adjust process
**Key insight:** You can't eliminate biases, but you can design systems that compensate for them.
---
**Return to:** [Main Skill](../SKILL.md#interactive-menu)