Handwrite these notes in your binder. Google or brainstorm examples. Add examples to your notes.
- Anchoring Bias: You rely too heavily on the first piece of information you hear, and it colors the decisions you make afterward.
- Confirmation Bias: You believe a statement, even when there is no truth to it, because it conforms to a belief you already hold. Oftentimes, this bias causes us to "tune into" only the facts that reflect our existing beliefs. We assume we are making rational decisions, but actually, our minds trick us into finding "reasonable" pieces of information ("facts") that support our emotional biases.
- Overconfidence Bias: Individuals tend to believe in their own thoughts too much (they are overconfident that they are always correct or that things will always turn out the way they want). As a result, individuals ignore other people's ideas, even when those ideas may be better. Because of the overconfidence bias, we sometimes do not prepare well enough for a task and, as a result, do less well than we thought we would. A related bias is Optimism Bias, the belief that everything will always go well. The opposite of the Optimism Bias is the Pessimism Bias, the belief that everything will always go poorly. It is important to find a middle ground in your cognitive process.
- Fundamental Attribution Error: People tend to blame another person's character when something goes wrong ("he's careless," "she's lazy") rather than the circumstances, while excusing their own mistakes as caused by the situation.
- Gambler's Fallacy: People irrationally believe that past random events change the odds of future ones. For example, after a coin lands on heads five times in a row, a gambler may feel that tails is "due," even though each flip remains a 50/50 chance.
- Availability Heuristic (a heuristic is an approach to problem-solving or learning): People judge how likely or important something is by how easily examples come to mind, relying strictly upon examples from their own past to make judgments or decisions. They do not take other pieces of information or other examples into consideration.
- The Bandwagon Effect: A person tends to believe or think something just because everyone else believes or thinks it.
- Blind Spot Bias: People are blind to the reality that they possess biases. They think others have biases, but that they do not.
- Conservatism Bias: People cling to their prior beliefs and are slow to accept new truths or new facts, often because they don't like change.
- Information Bias: People sometimes want more and more information, presuming that a lot of information improves their ability to process things and make decisions. Sometimes too much information clouds judgment, creating the impression that a problem has many more angles than it really does, when two or three angles are already enough.
- The Ostrich Effect: Like an ostrich, people "bury their heads in the sand," brushing off unpleasant information or changes because the facts make them uncomfortable. They think problems will disappear if they ignore them.
- Outcome Bias: People judge a past decision by its outcome rather than by the quality of the decision at the time it was made. A risky choice that happened to work out was not necessarily a wise choice, because life is often changing and the same decision may not produce the same outcome next time (compare the Gambler's Fallacy above).
- Pro-Innovation Bias: People believe that everyone should adopt a new technology simply because it is new, even when it does not improve everyone's life. Sometimes the innovation isn't even good. The bias is assuming that something new or "improved" is always the best.
- Choice-Supportive Bias: People defend a choice they made and remember it as better than it actually was, sticking with it even after realizing it is not good for them. Sometimes people stand by the choice because they are afraid of admitting a mistake or scared of being ridiculed.
- The Placebo Effect: A common example is when doctors give people different sets of pills--one an actual medicine and the other a sugar pill. Because people believe they are taking the medicine, they feel better. The power of the mind is strong. Just because someone believes something is good doesn't mean it is. A person could be unconsciously convincing herself that a product is effective, even when it is not.
- The Cheerleader Effect: People think someone in a group is beautiful or great simply because they believe the group as a whole is excellent. Someone is "great by association." When the group breaks up, people see the individuals as they really are; good and bad qualities become apparent.
- Curse of Knowledge: Once people know something, they find it hard to imagine not knowing it, so they explain things poorly to others who have different backgrounds, different abilities, and different levels of understanding. Those with knowledge should remember that what seems obvious to them may not be obvious to their audience. People believe what they want to believe, and sometimes all the knowledge you are willing to impart is not wanted, and even if considered by someone else, it would not make a difference.
- Salience Bias: People focus on the most noticeable or striking feature of someone or something and let it dominate their judgment, even when that feature is misleading. As a result, people tend to judge others irrationally.
- Illusory Superiority: Often, unskilled individuals are overconfident precisely because of their ignorance, whereas those who are truly intelligent and/or skilled may underestimate themselves. Intelligent people sometimes put too much pressure on themselves and are their own worst critics, while less skilled people may think they know it all and have no fear.
- Survivorship Bias: People draw conclusions only from the successes they can see (the "survivors") and overlook the failures that have disappeared from view. For example, someone opens a business because the businesses around them seem to thrive, without researching how many similar businesses failed and closed. The business fails. Individuals need to consider both the positives and the negatives, the winners and the losers, of any endeavor.
- Money Illusion: People tend to focus on the face value of money rather than its real purchasing power. For example, when offered money as an incentive to take on a new job, they may "jump on" the opportunity without thinking through whether the money is really worth taking on the new position, or without calculating whether the amount offered is worth it in the long run. "Free money" can be very alluring. Learn to stop and reflect before accepting money as an incentive.
- Reactance Bias: (The psychological term "reactance" means an unpleasant feeling when confronted with certain people, rules, or regulations that might threaten or eliminate a perceived freedom.) People feel that someone or something is taking away their choices or limiting them in some way. Individuals may refuse to do something simply because others are suggesting or telling them that the action would be good for them. They may feel pressured and, as a result, choose to irrationally rebel without thoughtfully considering the advice.
- Rhyme as Reason Effect: When people hear rhyming words, they are more likely to believe that statement than a phrase or sentence that does not rhyme, regardless of the truth of the rhyming statement. People tend to more easily believe words and phrases that sound pleasant. Examples: Birds of a feather flock together; If it doesn't fit, you must acquit; Build a wall and crime will fall!
- Status Quo Bias: People tend to prefer that things and people remain the same, no matter how much time has passed or what events have transpired. The truth is that people, things, and situations can and do change.
- Illusion of Control: "If only I had done . . . then this wouldn't have happened." The truth is that a lot of things happen, and most of the time you have no control over outcomes. People want to feel in control because it makes them feel safer, but much of what happens in life is chaotic, random, and simply out of a person's control.
(Adapted and abridged--Source: The 25 Cognitive Biases: Understanding Human Psychology, Decision Making & How To Not Fall Victim to Them by Kai Musashi.)
To learn about additional biases, check out these links: https://www.businessinsider.com/cognitive-biases-that-affect-decisions-2015-8; https://www.businessinsider.com/cognitive-biases-2015-10; https://en.wikipedia.org/wiki/Cognitive_bias; https://scholarmulhern.blogspot.com/2018/09/articles-on-cognitive-bias.html
Also see post on Logical Fallacies: https://scholarmulhern.blogspot.com/2015/04/ap-english-links-with-information-about.html
Always check the facts: https://scholarmulhern.blogspot.com/p/check-facts-you-hear-in-media.html