Introduction
Most of us like to believe that we make decisions based on logic, evidence, and careful thought. Whether we are choosing a career path, voting in an election, buying a product, or deciding how to respond in a relationship, we assume that our reasoning is deliberate and rational. Psychology, however, tells a very different story. Human decision-making is deeply influenced by cognitive biases—systematic patterns of thinking that deviate from objective logic (Kahneman, 2011).

Cognitive biases are not flaws in intelligence; they are shortcuts the brain uses to conserve energy. In a world filled with overwhelming information, these mental shortcuts help us make quick judgments. Unfortunately, they can also distort our perception of reality, leading us to make decisions that are inconsistent, irrational, or even harmful. Many of these biases operate below conscious awareness, quietly shaping our beliefs and choices.
1. Anchoring Bias
Anchoring bias occurs when people rely too heavily on the first piece of information they encounter when making a decision (Tversky & Kahneman, 1974). This initial information—known as the “anchor”—sets a reference point that influences subsequent judgments. For example, if you see a jacket originally priced at ₹10,000 but discounted to ₹6,000, the original price acts as an anchor. The discounted price feels like a bargain, even if ₹6,000 is still expensive. Anchoring influences salary negotiations, shopping behavior, legal judgments, and even medical decisions.
Once an anchor is established, adjusting away from it requires conscious effort, which many people fail to apply fully.
2. Confirmation Bias
Confirmation bias is the tendency to seek out, interpret, and remember information that supports existing beliefs while ignoring or dismissing contradictory evidence (Nickerson, 1998). In everyday life, confirmation bias shows up when people consume news that aligns with their political views, follow social media accounts that reinforce their opinions, or interpret ambiguous situations in ways that confirm what they already believe. This bias plays a major role in polarization, conspiracy thinking, and resistance to change.
Rather than evaluating evidence objectively, the mind becomes a defense lawyer for its own beliefs.
3. Overconfidence Bias
Overconfidence bias refers to the tendency for individuals to overestimate their abilities, their knowledge, or the accuracy of their judgments (Moore & Healy, 2008). Research consistently shows that people rate themselves as above average in skills such as driving, leadership, and intelligence. This bias can be dangerous. Overconfidence contributes to poor financial decisions, inadequate preparation, and underestimation of risks. In professional settings, it can lead to flawed strategies and resistance to feedback.

Ironically, those with the least competence often overestimate themselves the most, a pattern closely related to the Dunning-Kruger effect.
4. Availability Heuristic
The availability heuristic is a mental shortcut where people estimate the likelihood of events based on how easily examples come to mind (Tversky & Kahneman, 1973). For instance, after seeing frequent media coverage of airplane crashes, people may believe flying is more dangerous than driving, despite statistical evidence showing the opposite. Dramatic, emotional, or recent events are easier to recall, making them seem more common than they actually are.
This bias heavily influences risk perception, fear, and public opinion.
5. Framing Effect
The framing effect occurs when different presentations of the same information lead to different decisions (Kahneman & Tversky, 1984). A medical treatment described as having a “90% survival rate” is perceived more positively than one described as having a “10% mortality rate,” even though both statements convey identical information. The way choices are framed affects judgments in healthcare, marketing, politics, and personal decision-making.
Framing highlights how perception, not facts alone, shapes decisions.
6. Status Quo Bias
Status quo bias is the preference for maintaining the current state of affairs, even when better alternatives exist (Samuelson & Zeckhauser, 1988). People often stick with familiar routines, jobs, or systems simply because change feels uncomfortable or risky. This bias explains why individuals stay in unfulfilling relationships, resist organizational change, or continue using outdated technology.

The desire for stability can override rational evaluation of potential benefits.
7. Sunk Cost Fallacy
The sunk cost fallacy occurs when people continue investing time, money, or effort into something because of what they have already invested, rather than evaluating current and future outcomes (Arkes & Blumer, 1985). Examples include staying in a failing business because of previous expenses or continuing to watch a movie you dislike because you already paid for the ticket. Rational decision-making requires ignoring sunk costs, but emotionally, letting go feels like admitting failure.
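To make the logic concrete, here is a small worked example with hypothetical figures: suppose you have already spent ₹50,000 on a project, finishing it would cost another ₹20,000, and the finished result is worth about ₹15,000 to you. The rational comparison leaves the ₹50,000 out entirely:

\[
\text{continue only if expected future benefit} > \text{remaining cost}, \qquad 15{,}000 < 20{,}000 \;\Rightarrow\; \text{stop}.
\]

The ₹50,000 is gone whether you continue or not, so it belongs on neither side of the comparison.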
This bias keeps people trapped in unproductive commitments.
8. Bandwagon Effect
The bandwagon effect describes the tendency to adopt beliefs or behaviors simply because many others are doing so (Leibenstein, 1950). Social trends, viral challenges, fashion fads, and political movements often spread through this effect. People assume that popularity equals correctness or value. Social belonging and fear of exclusion amplify this bias.
While conformity can foster social cohesion, it can also suppress independent thinking.
9. Loss Aversion
Loss aversion refers to the finding that losses feel more painful than equivalent gains feel pleasurable (Kahneman & Tversky, 1979). Losing ₹1,000 feels significantly worse than the pleasure of gaining ₹1,000. This bias explains why people avoid risks, hold onto losing investments, and resist change even when it could lead to improvement.
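Prospect theory formalizes this asymmetry with a value function. A minimal sketch, using the commonly cited parameter estimates α ≈ 0.88 and λ ≈ 2.25 from later empirical work by Tversky and Kahneman (not the 1979 paper itself), looks like this:

\[
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0 \\
-\lambda\,(-x)^{\alpha}, & x < 0
\end{cases}
\]

Plugging in ₹1,000 gives v(+1,000) ≈ 437 but v(−1,000) ≈ −982, so under these illustrative parameters a loss is weighted more than twice as heavily as an equal gain.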
Loss aversion has major implications for finance, relationships, and personal growth.
10. Halo Effect
The halo effect occurs when a positive impression in one area influences perceptions in unrelated areas (Thorndike, 1920). Attractive people are often assumed to be more intelligent or competent. Well-spoken individuals are perceived as more knowledgeable. This bias affects hiring decisions, branding, education, and interpersonal judgments.
First impressions can overshadow objective evaluation.
Conclusion
Cognitive biases are an unavoidable part of human cognition. They simplify decision-making in a complex world but often lead to predictable errors. Becoming aware of these biases does not eliminate them, but it allows us to slow down, question assumptions, and seek more balanced perspectives.
Better decisions begin not with intelligence alone, but with humility and self-awareness.
References
Arkes, H. R., & Blumer, C. (1985). The psychology of sunk cost. Organizational Behavior and Human Decision Processes, 35(1), 124–140.
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
Kahneman, D., & Tversky, A. (1979). Prospect theory: An analysis of decision under risk. Econometrica, 47(2), 263–291.
Kahneman, D., & Tversky, A. (1984). Choices, values, and frames. American Psychologist, 39(4), 341–350.
Leibenstein, H. (1950). Bandwagon, snob, and Veblen effects in the theory of consumers' demand. Quarterly Journal of Economics, 64(2), 183–207.
Nickerson, R. S. (1998). Confirmation bias: A ubiquitous phenomenon. Review of General Psychology, 2(2), 175–220.
Samuelson, W., & Zeckhauser, R. (1988). Status quo bias in decision making. Journal of Risk and Uncertainty, 1(1), 7–59.
Thorndike, E. L. (1920). A constant error in psychological ratings. Journal of Applied Psychology, 4(1), 25–29.
Tversky, A., & Kahneman, D. (1973). Availability: A heuristic for judging frequency and probability. Cognitive Psychology, 5(2), 207–232.
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131.



