We just discussed what is known as the ‘correspondence bias.’ Something that overlaps with this bias is the fundamental attribution error: the reinforced tendency to draw conclusions by overestimating dispositional factors even when a logical analysis suggests such conclusions should not be drawn. Two supporting studies are the quiz-game study of Ross et al. (1977) and the Castro-essay evaluation of Jones and Harris (1967). In both studies, the honest answer to the question posed (‘Who is the smartest?’ or ‘What did the writer think of Castro?’) should have been ‘I don’t know.’ Instead, participants made dispositional attributions even though they were informed about the experiment’s details, demonstrating the fundamental attribution error.
“The three most difficult words for people to say are ‘I don’t know’!” – Jacque Fresco
When you get angry at a flight attendant because the food on board the airplane isn’t made to your taste, or scream at a cashier because your bank has increased the interest rate on your loan, you are a victim of the correspondence bias. The situation is not the employees’ fault, yet you scream at them. If you have been on the receiving end in the past, you might have said, ‘It’s not your fault, you are just doing your job.’ Most people do not have the capacity to stop themselves from making this attribution error. Getting angry at the clerk who informs you that your loan has not been approved, or thanking the one who gives you the positive news of approval, is another irrational behavior resulting from the correspondence bias. Likewise, Caucasian people in certain ex-colonial countries might be viewed as smarter than the local indigenous population. All of these are examples of the correspondence bias.
This bias explains our tendency to give a thorough explanation of why ‘we’ behaved a certain way in a situation, in other words, to treat our own behavior as a product of the situation. Nevertheless, we will quickly label other people as if the individual, rather than the situation, were responsible. The mechanism behind this is known as ‘focalism’: we observe other people’s behavior but not their situation; we see ‘them’ doing what we are observing, as if they are the most likely cause of their own behavior. When we sit back and analyze our own behavior, however, we can observe the mechanisms that led us to behave in such a way.
When we think carefully about our own behavior, we see the situation. This mechanism is strong enough that we may even attribute the behavior of objects to their ‘personalities.’ This is well demonstrated in the Heider and Simmel experiment: if the reader watches the short experiment video, they will likely find themselves providing irrational reasons for the behavior of the inanimate shapes. This is our tendency to anthropomorphize, to project human values, or our own values, onto other things or beings.
We should be teaching people to say ‘I don’t know what part of his environment generated his behavior,’ or ‘I don’t have enough information to draw a conclusion on the subject.’ Being aware of the actor-observer bias may help you look at the evidence before making a declarative statement about something or someone. By ‘the evidence’ I mean the environmental factors that triggered gene expression, forming certain proteins which in turn generated a reaction pattern that, once reinforced by the environment, became a feature of that person’s personality.
Your environment, your social standing, and the situations you are exposed to are the overriding causes of your behavior. Some argue that your personality plays a significant role in the situations you find yourself in, but this hypothesis overlooks the fact that your personality was itself shaped by the environment you were raised in. Those who think this is a ‘which came first, the chicken or the egg’ argument need to become more familiar with the effects of environment on behavior.
All these are ideas that threaten the established system. In most universities, professors usually do not condemn the outdated values of our current system, even though they might be teaching ideas similar to the ones presented above. Jacque Fresco describes this process in his argument about academia. If the justice system had to make situational attributions, it would have to close down all prisons, because its entire philosophy is based on making ‘unsane,’ convenient dispositional attributions. People are incarcerated for long periods, the conviction stays on their record, and it becomes difficult for them to find a job once released, so most are forced back into crime. Next time, they might end up doing something worse and receive the death penalty. All this is due to outdated information based on the ‘unsane’ values deduced by philosophers in the 17th and 18th centuries.
Confirmation bias is the reason you hear people saying, ‘I did some research online, and I have confirmed that vaccines are dangerous for children.’ It is the tendency to search for or interpret information in a way that confirms one’s preconceptions, leading to statistical errors. This type of bias reveals itself in metaphysical explanations and in political and cultural views. It is a phenomenon wherein decision makers, whether individuals, pilots, doctors, businessmen or political leaders, have been shown to actively seek out and assign more weight to evidence that confirms their presuppositions, while ignoring or under-weighing evidence that would otherwise disprove them.
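The lopsided weighting just described can be sketched as a toy model. In this hypothetical Python snippet (the evidence values and the discount weight are made-up parameters for illustration, not measured quantities), two observers update their odds on the same evenly balanced evidence; the biased observer discounts disconfirming items and walks away ‘confirmed.’

```python
# Toy model of confirmation bias: two observers see identical mixed
# evidence, but the biased one under-weighs disconfirming items.
# Illustrative only; weights and likelihood ratios are invented.

def updated_odds(prior_odds, evidence, w_confirm=1.0, w_disconfirm=1.0):
    """Multiply prior odds by each item's likelihood ratio, raised to a
    weight that models how seriously the item is taken."""
    odds = prior_odds
    for likelihood_ratio in evidence:
        weight = w_confirm if likelihood_ratio > 1 else w_disconfirm
        odds *= likelihood_ratio ** weight
    return odds

# Perfectly balanced evidence: two items for (ratio 2), two against (ratio 0.5).
evidence = [2.0, 0.5, 2.0, 0.5]

fair = updated_odds(1.0, evidence)  # weighs everything equally
biased = updated_odds(1.0, evidence, w_confirm=1.0,
                      w_disconfirm=0.25)  # shrugs off disconfirmation

print(f"fair observer's odds:   {fair:.2f}")   # stays at even odds
print(f"biased observer's odds: {biased:.2f}")  # drifts toward 'confirmed'
```

With balanced evidence the fair observer ends exactly where they started, while the biased observer’s odds climb, purely as an artifact of the weighting, which is the statistical error the paragraph above describes.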
Pilots attempt to mitigate this bias by asking open questions after they have thought of a possible solution. Instead of telling their colleagues the solution, they ask what the colleagues think. For example: ‘Look at the weather ahead; what do you think we should do?’ The colleague might say, ‘I suggest we climb to avoid most of the stratocumulus cloud. We should also deviate up to 40nm left of our track, since the wind is coming from the left, so we don’t get the turbulence from the cumulonimbus cloud. But just in case, let’s put the seat belt sign on and warn the cabin crew.’ If the first pilot doesn’t have anything to add, they set aside their own need for participation and simply say, ‘That’s a good idea!’ while performing the appropriate action. In other words, they don’t seek confirmation of their thoughts; they ask open questions so they might hear something they didn’t think of or weren’t aware of, such as the direction of the wind.
A person who is pro-communism will seek out all the evidence as to why communism is good and will ignore most other factors. A BMW fan will look for all the reasons why BMW is an amazing car compared to Mercedes and place less weight on facts that contradict this view. If this is true, how can we trust opinions when making decisions?
Can we trust our uninformed opinions to help us make efficient decisions when we know that they are the result of so many biases, errors, and influences? The answer, of course, is that we can only do so to a limited extent. An airplane’s pitot-static system measures total and static pressure to the very last millibar; a computer then corrects these readings for instrument, position and pressure error, giving a remarkably accurate calibrated airspeed indication. It further corrects for compressibility and density error to give you ‘true airspeed,’ and performs an additional correction for the wind, providing you with a ‘ground speed.’ Before such accurate systems were introduced, many pilots stalled and died because they had to rely on estimates of their airspeed; the system was developed to solve that problem. Similarly today, sensors can collect data and computers can process that data to help us arrive at decisions more efficiently than our opinions can.
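The last two steps of that correction chain can be sketched in a few lines. This is a simplified Python illustration, not an air-data computer: it uses the common low-speed approximation TAS ≈ CAS / √σ (σ being the ratio of local air density to sea-level density), omits the instrument, position and compressibility corrections mentioned above, and the sample density and wind figures are invented for the example.

```python
import math

SEA_LEVEL_DENSITY = 1.225  # kg/m^3, ISA standard sea-level air density

def true_airspeed(cas_kt, air_density):
    """Convert calibrated airspeed (knots) to true airspeed using the
    low-speed approximation TAS = CAS / sqrt(density ratio)."""
    sigma = air_density / SEA_LEVEL_DENSITY
    return cas_kt / math.sqrt(sigma)

def ground_speed(tas_kt, wind_kt, wind_angle_deg):
    """Correct TAS for the along-track wind component.
    wind_angle_deg is the angle between our track and the direction the
    wind blows FROM (0 deg = direct headwind)."""
    headwind = wind_kt * math.cos(math.radians(wind_angle_deg))
    return tas_kt - headwind

# Illustrative numbers: 250 kt CAS in thin high-altitude air,
# with a 50 kt wind 30 degrees off the nose.
tas = true_airspeed(250.0, 0.55)
gs = ground_speed(tas, 50.0, 30.0)
print(f"TAS = {tas:.0f} kt, ground speed = {gs:.0f} kt")
```

The point of the analogy survives the simplification: each stage takes a measured quantity and mechanically removes a known source of error, which is exactly what an opinion cannot do for itself.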
When exploring the self-serving bias, we see how miserably we can fail at arriving at decisions without utilizing the scientific method with human and environmental concern. Here are some of the ways we can fail:
- Motivated reasoning: the tendency to come to conclusions that make you feel good;
- Motivated handling: the tendency to attach more significance to facts that match one’s views and to argue that these facts are more important, a practice extensively used in political election campaigns;
- Motivated group affiliation: the tendency for an individual or a group to support another individual or group purely because they have succeeded;
- Motivated recall: the tendency to remember facts that reflect well on oneself, and to recall evidence that endorses one’s assumptions more readily than evidence that does not;
- Self-handicapping: the tendency to invoke real or feigned handicaps, a cognitive strategy by which people avoid investing effort in the hope of keeping potential failure from hurting their self-esteem;
- The above-average effect: people who think that extensive learning about the proposals of The Venus Project, and detailed examination of the methodologies used to arrive at a resource-based economy, is unnecessary fall victim to the tendency to believe they have an above-average understanding;
- The Dunning–Kruger effect: relatively unskilled persons suffer from illusory superiority, mistakenly assessing their ability as much higher than it is;
- And last but not least, the ‘holier than thou’ effect: the tendency for people to believe they are better than others from a moralistic perspective.