Tuesday, January 9, 2018

Cognitive Biases

(from the Pendleton Book Blather Facebook group)

...and, in case you are a latecomer and have missed the point:
WHY cognitive biases? Why are they interesting?
IM(not so)HO, this is the single most important area of sociology after demographics, because it studies the decision making of individuals and groups. It is practical!
Hard sciences (math, physics, chemistry, etc.): Good grounding in the cognitive sciences makes it easier for scientists, particularly physicists, to construct models (both for themselves and for us) that are less likely to be anthropocentric constructs and hopefully more likely to actually describe physical reality. If that is possible. Lisa Randall, Stephen Hawking, and the late Richard Feynman are/were particularly good at this.
Economics particularly suffers from one particular flaw: EVERYBODY lies about money. And cognitive science can help disperse the smoke. Paul Krugman (http://krugman.blogs.nytimes.com/) continually refers, in his wonkier postings, to the problems economists run into when they try to model the decision making process of different 'actors' (consumers, stock market players/brokers/exchanges, governments).
Medical science, particularly diagnostic and preventive medicine: A real minefield, here....people make MAJOR decisions about their own health, throughout life, and are almost never capable of objective judgement in that regard. And because of our extremely screwed up health care system, medical professionals and institutions also suffer from built-in biases, as the result of ambiguous mandates and many disincentives to actually provide appropriate care.
I could go on, and on, and on....

History! Gah! Helps us understand and correct for the biases of individual and group decision making by both historians and historical actors. Extremely important if you are to comprehend context and compare texts and accounts. After all, they always complain that history is just a long, boring litany of 'kings'...and what do kings do, but make decisions all day long?



EXPERIMENTER'S or EXPECTATION BIAS: The tendency for experimenters to believe, certify, and publish data that agree with their expectations for the outcome of an experiment, and to disbelieve, discard, or downgrade the corresponding weightings for data that appear to conflict with those expectations.
Cold Fusion. The Piltdown Man. Differences in intelligence based on race or gender. Statin drugs. The Tonkin Gulf Incident.
They tell themselves: "Why do all this work, waste all this money, invest all this emotional capital...and find out we are wrong?" Even if they are honest, they are going to unconsciously EXPECT results to be different.
Negative findings are seldom published; after all, the experimenter has just wasted his time! He would rather toss his data in the trash, design a NEW experiment, and hope that THAT turns out the way he expected.

EXAGGERATED EXPECTATION Based on the estimates, real-world evidence turns out to be less extreme than our expectations (conditionally inverse of the conservatism bias).
Translated....the universe is more boring than we give it credit for....sure, we are surprised (the conservatism bias) when we flip six heads in a row...but that doesn't happen often. Our poor little memories are associative, and the extremes and landmarks and aberrations are their anchors. We forget the boring stretch of straight road between the curves...
This is another case (sort of the opposite of conservatism bias) where we have a difficult time understanding the true meaning of 'random'. Probability is NOT intuitive.
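For the coin-flip example, a quick back-of-the-envelope calculation (a minimal Python sketch) shows just how rare a six-head streak really is on any single try:

```python
# Probability of six heads in a row with a fair coin: (1/2)^6
p_six_heads = 0.5 ** 6
print(p_six_heads)  # 0.015625 -- about 1 chance in 64
```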

ENDOWMENT EFFECT: The fact that people often demand much more to give up an object than they would be willing to pay to acquire it.
Essentially, psychological researchers have repeatedly shown that the perceived value of an object almost IMMEDIATELY doubles (at least) once a person has purchased or acquired it. Bizarre and profoundly strange. I always wonder if this is the true root cause of acquisitiveness, the inability of many greedy people to relinquish something once they've gained possession of it, legally or not.
Yes, you had really better read the link on this particular one. It has profound effects on human economic activity, to the point where it is an absolutely essential field of study for marketers.

EMPATHY GAP: The tendency to underestimate the influence or strength of feelings, in either oneself or others.
------------------------
This is where we make a decision based on emotional factors ('I wanted to hit that guy.') and then later rationalize it ('I wanted to hit him because he looked at me funny.').
By the same token, you are likely to attribute to evil what may merely be bad anger management (using the same situation in reverse).
So...honesty (to yourself and through empathy) is always the best policy here. Delusion starts with denying that you and other people have feelings at all.

CURSE OF KNOWLEDGE: When better-informed people find it extremely difficult to think about problems from the perspective of lesser-informed people.
----------------
So this should make anyone who has been accused of being 'elitist' feel a little better. This is also one piece of the Dunning-Kruger effect.
As a science and computer geek, I find this 'curse' fairly easy to comprehend. I have spent 40 years overdosing on science, and almost 35 years learning about computers. And I have spent at least that long trying to explain WHY these things work the way they do.
And...I do have a solution to the 'curse': I mentally revert to the time when I was first learning the thing the less informed person is stuck on, and pretend I am in the same state of ignorance. Believe it or not, this has helped me greatly with my customers, when I bother to slow down and try it.

DISTINCTION BIAS: The tendency to view two options as more dissimilar when evaluating them simultaneously than when evaluating them separately.
-- and --
CONTRAST EFFECT: The enhancement or reduction of a certain perception's stimuli when compared with a recently observed, contrasting object.
--------------------------------
The first is closely related to the 'less-is-better effect': we tend to notice more defining details when comparing two similar objects side by side, so we are more likely to judge something fairly if we isolate it from other things we might want to compare it against.
The second one is very similar; the way to avoid it is to realize that proximity in TIME can also affect one's perception. Ferinstance, you are more likely to think a person is attractive if you have just been shown a picture of someone else who is attractive.
These two just underline the fact that our brains are NOT logic engines; we are NOT good at filtering out the emotional and coincidentally irrelevant details of life to arrive at the basic data.

DENOMINATION EFFECT: The tendency to spend more money when it is denominated in small amounts (e.g. coins) rather than large amounts (e.g. bills).
---------------
I am guessing that most people are reluctant to break large bills just to buy something small....or possibly there is a certain amount of superstitious awe involved in 'disrupting' round numbers?
Question: If you were given exactly $1000 as a windfall, and did not have any immediate needs...how likely are you to spend it on something trivial like a large TV? Or would you feel 'better' saving for later, paying rent or buying food with it?
If, on the other hand, you get a tax return for $738.25, would you be more likely to spend it immediately on non-essentials?
I think this bias might be related to the well-known marketing tactic of (ferinstance) selling a product for $39.95 rather than $40. The 'random' string of digits in the former is less intimidating than the round number of the latter.

DECOY EFFECT: Preferences for either option A or B change in favor of option B when option C is presented, which is similar to option B but in no way better.
-----------------------
Muddying the waters, essentially. A common marketing tactic that takes advantage of this bias involves offering a third alternative that is more expensive BUT has fewer positive features. This makes the more expensive of the original two choices look more appealing by contrast.

CONSERVATIVE/REGRESSIVE BIAS: A certain state of mind wherein high values and high likelihoods are overestimated while low values and low likelihoods are underestimated.
-- and/or --
BAYESIAN CONSERVATISM: The tendency to revise one's belief insufficiently when presented with new evidence.
-------------
Eek. Very closely correlated these are.
The first results in the tendency to ignore 'data' that is in the minority (to use some rather slanted terms, heheh)....one could say that this type of conservatism defines the natural anti-democratic strain in human nature. This is a much less loaded explanation that avoids referring to such tendencies as 'narcissism', 'egotism', and 'greed'. Smirk.
The second can be mistaken for plain old pigheadedness, but....in both of these cases, we are talking about biases here, not necessarily fully conscious choices. In other words, like most of this list, we REALLY REALLY have to work hard to fight against these things, to dredge them up and examine them consciously rather than let them quietly build delusions for us.

CONJUNCTION FALLACY: The tendency to assume that specific conditions are more probable than general ones.
An example spells this out best:
"Linda is 31 years old, single, outspoken, and very bright. She majored in philosophy. As a student, she was deeply concerned with issues of discrimination and social justice, and also participated in anti-nuclear demonstrations.
Which is more probable?
A. Linda is a bank teller.
B. Linda is a bank teller and is active in the feminist movement."
The answer is A, but the probabilistically naive vastly favor B. Whew. A combination of attributes can NEVER be more likely than either attribute alone.
------------------------
This is a hard one, so give this a serious think. Try to ignore the intentional political/social smoke screen this question raises in your face (which leverages another common bias) and get to the meat of the thing...
This fallacy can be taken advantage of in a number of subtle ways, including suckering people into accepting short-cuts in arguments that are not logically valid, or are inconsistent with reality.
Conspiracy buffs are great practitioners:
Which is more probable?
A. Traffic cameras are used to enforce traffic laws.
B. Traffic cameras are used to enforce traffic laws and spy on private citizens.
Hmmm?
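The 'never more likely' rule can be demonstrated with a short simulation. This is a sketch only; the probabilities below are invented purely for illustration, not data about actual bank tellers:

```python
import random

random.seed(42)  # reproducible run

# Invented probabilities, purely for illustration.
N = 100_000
teller = [random.random() < 0.02 for _ in range(N)]    # P(bank teller) = 2%
activist = [random.random() < 0.30 for _ in range(N)]  # P(activist) = 30%

p_a = sum(teller) / N                                       # P(A)
p_ab = sum(t and f for t, f in zip(teller, activist)) / N   # P(A and B)

# The conjunction can never beat the single condition.
assert p_ab <= p_a
print(f"P(teller) = {p_a:.4f}, P(teller AND activist) = {p_ab:.4f}")
```

Whatever numbers you pick, P(A and B) cannot exceed P(A), because every person counted in the conjunction is also counted in the single condition.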

CONGRUENCE BIAS: The tendency to test hypotheses exclusively through direct testing, instead of testing possible alternative hypotheses.
Very, very common with 'purchased research' that is unlikely to be subjected to heavy peer review. Basically, you set up an experiment (or test a product or idea) only to verify that the positive result is valid (e.g. push the button, the door opens), without testing whether an alternative might also yield an interesting result (e.g. hit another button, or no button, and the door still opens).
Drug trials are particularly prone to this bias, esp. since the placebo effect can cause a false correlation between the effectiveness of a drug and the (possibly ephemeral) alleviation of the illness.

CLUSTERING ILLUSION: The tendency to overestimate the importance of small runs, streaks, or clusters in large samples of random data (that is, seeing phantom patterns).
Yup. Another illustration of how poorly equipped we are to handle the world as it is.....and why the gambling industry exists.
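A quick simulation (a sketch in Python) makes the point: even perfectly random coin flips routinely contain long 'streaks' that feel anything but random:

```python
import random

random.seed(7)  # reproducible run

# 1000 fair coin flips, then find the longest run of identical outcomes.
flips = [random.choice("HT") for _ in range(1000)]

longest = current = 1
for prev, cur in zip(flips, flips[1:]):
    current = current + 1 if cur == prev else 1
    longest = max(longest, current)

# For 1000 flips, the longest streak is typically around 9 or 10 --
# a 'pattern' that means nothing at all.
print(f"Longest streak: {longest}")
```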

CHEERLEADER EFFECT: The tendency for people to appear more attractive in a group than in isolation.
Hmmmm. Not sure how one would go about avoiding this one. This is probably a combination of vanity and a desire to be part of a group...?

CHOICE-SUPPORTIVE BIAS: The tendency to remember one's choices as better than they actually were.
There are many related biases going on here as well, including hindsight bias, selective memory, and failure to remember neutral or negative outcomes...remembering things accurately is hard or impossible, and unless you are simply trying to become less deluded, it is probably pointless to worry about what has already happened.
So the takeaway from this should probably be: Don't depend on your brilliant judgement for the next big decision you make on this subject...you may have been lucky, or you may have made the right decision for the wrong reason. Certainly don't expect to have learned everything already! Get all the data you can, every time..if you have time.

BIAS BLIND SPOT: The tendency to see oneself as less biased than other people, or to be able to identify more cognitive biases in others than in oneself.
This is the main (valid) complaint against the 'liberal elite'. Ahem.

BELIEF BIAS: An effect where someone's evaluation of the logical strength of an argument is biased by the believability of the conclusion.
Captain Obvious strikes again! Dishonesty starts at home....

BASE RATE FALLACY: The tendency to ignore base rate information (generic, general information) and focus on specific information (information only pertaining to a certain case).
Example:
Police breathalyzers falsely indicate drunkenness in 5% of the sober drivers tested, but never fail to detect a truly drunk person. One driver in 1000 is driving drunk. Suppose a policeman stops a driver at random and makes him take a breathalyzer test, and it indicates he is drunk. Assuming you know nothing else about him, how high is the probability that he really is drunk?
Many would answer as high as 95%, but the correct probability is about 2%.....think about it: out of 1000 drivers, the 1 drunk will fail the test, and so will about 50 (5%) of the 999 sober ones. So it is roughly FIFTY TIMES MORE LIKELY that the policeman has 'failed' a sober person than caught the drunk one.
This is a tricky bias to get the head around, but very, very common. People are terrible at estimating probabilities, much less comparing them. And as you might guess in the example above, people will tend to fudge their estimates in the direction they expect them to go....
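The arithmetic above is just Bayes' theorem; here is a minimal Python sketch of the same calculation:

```python
# Numbers from the example: 1 driver in 1000 is drunk; the breathalyzer
# has a 5% false-positive rate and never misses a truly drunk driver.
p_drunk = 1 / 1000
p_pos_given_drunk = 1.0
p_pos_given_sober = 0.05

# Total probability of a positive test, then Bayes' theorem.
p_pos = p_pos_given_drunk * p_drunk + p_pos_given_sober * (1 - p_drunk)
p_drunk_given_pos = (p_pos_given_drunk * p_drunk) / p_pos

print(f"P(drunk | positive test) = {p_drunk_given_pos:.4f}")  # 0.0196, about 2%
```

Note how the tiny base rate (1 in 1000) swamps the test's apparent 95% reliability.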

BANDWAGON EFFECT: The tendency to do (or believe) things because many other people do (or believe) the same. Related to groupthink and herd behavior.
Well...this one SHOULD be obvious. But amazing how often people forget it. A little bit of solitary mulling-over is all important.

BACKFIRE EFFECT: When people react to disconfirming evidence by strengthening their beliefs.
Another one that seems obvious...until you find yourself doing it. Again...question your fundamentals whenever you think you can stand the pain of it.

AVAILABILITY CASCADE: A self-reinforcing process in which a collective belief gains more and more plausibility through its increasing repetition in public discourse (or "repeat something long enough and it will become true").
I could hardly wait for this one. Merciful Jeebus. Do I really have to give any examples for this, or explain how incredibly damaging this kind of thing is? Let's have fun with this one, campers! Name your favorite item of topical, shrilly repeated gibberish.

ATTENTIONAL BIAS: The tendency of our perception to be affected by our recurring thoughts.
AVAILABILITY HEURISTIC: The tendency to overestimate the likelihood of events with greater "availability" in memory, which can be influenced by how recent the memories are or how unusual or emotionally charged they may be.
Slightly different causes, same general effect, here. PTSD would be a very good example, probably a mix of both tendencies.
'You are what you have been.' Intentional cultivation of empathy, perhaps by immersing oneself in the thought processes of different or even opposing points of view, can be of help here. Sink yourself into someone you are not....

ANCHORING: The tendency to rely too heavily, or "anchor," on one trait or piece of information when making decisions (usually the first piece of information that we acquire on that subject).
Again...don't get lazy about the information you use (do you even know where you got it?).....find more than one source for your information, and try to use good judgement about the value of each source.  This is one of the most common traits of TeaBaggers: they fixate on one issue and dogmatically defend their narrow view of that issue to the exclusion of others; even ones they share with their fellow circle-the-drainers...


AMBIGUITY EFFECT: The tendency to avoid options for which missing information makes the probability seem 'unknown'.
For example, people will be slow to adopt new methods if they lack information about the outcome. Ignorance is bliss.


