What prevents us from being objective: 11 cognitive biases
Cognitive biases are systematic errors in human thinking, a kind of logical trap. In certain situations we tend to follow irrational patterns, even when we believe we are using common sense.
The illusion of control
People tend to overestimate their influence on events whose successful outcome matters to them. This phenomenon was discovered in 1975 by the American psychologist Ellen Langer in experiments with lottery tickets. Participants were divided into two groups: people in the first group chose their own lottery tickets, while members of the second group were handed tickets with no choice. Two days before the draw, the experimenters offered participants of both groups the chance to exchange their ticket for one in a new lottery with better odds of winning.
The offer was clearly advantageous, yet those who had chosen their own tickets were reluctant to part with them, as if their personal choice of ticket could affect the likelihood of winning.
Preference for zero risk
Imagine you have a choice: reduce a small risk to zero, or significantly reduce a large risk. For example, eliminate plane crashes entirely, or dramatically cut the number of car accidents. Which would you choose?
Going by the statistics, the second option is better: far fewer people die in plane crashes than in car accidents, so that choice would save many more lives. Yet research shows that most people choose the first: zero risk in at least one area feels more reassuring, even if the chance of dying in a plane crash is already negligible.
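The arithmetic behind this trade-off can be made concrete. The figures below are purely illustrative, not real accident statistics:

```python
# Hypothetical annual death tolls (invented for illustration).
plane_deaths_per_year = 500
car_deaths_per_year = 40_000

# Option 1: reduce the small risk to zero.
saved_by_option_1 = plane_deaths_per_year

# Option 2: cut the large risk in half.
saved_by_option_2 = car_deaths_per_year // 2

print(saved_by_option_1)  # 500
print(saved_by_option_2)  # 20000
```

Even with the small risk eliminated completely, halving the large one saves forty times as many lives in this toy example, yet the "zero" option feels more attractive.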
Selective perception
Suppose you distrust GMOs. If the topic interests you, you probably read news and articles about genetically modified organisms, and as you read, you become more and more convinced you are right: there is a danger. But here's the catch: chances are you are paying far more attention to news that reinforces your point of view than to arguments in favor of GMOs.
So you lose objectivity. This tendency to notice information that fits our expectations and to ignore everything else is called selective perception.
The gambler's fallacy
The gambler's fallacy most often lies in wait for gambling enthusiasts. Many of them try to find a relationship between the probability of a desired outcome of a random event and its previous outcomes.
The simplest example is a coin flip: if tails has come up nine times in a row, most people will bet on heads next, as if a long run of tails increased the likelihood of heads. In fact, the odds remain the same: 50/50.
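This claim is easy to check with a simulation: flip a fair coin millions of times, collect only the flips that immediately follow nine tails in a row, and see how often heads comes up among them.

```python
import random

random.seed(0)
trials = 2_000_000
streak = 0              # current run of consecutive tails
after_nine_tails = []   # outcomes observed right after nine tails in a row

for _ in range(trials):
    flip = random.choice(("heads", "tails"))
    if streak >= 9:
        after_nine_tails.append(flip)
    streak = streak + 1 if flip == "tails" else 0

heads_share = after_nine_tails.count("heads") / len(after_nine_tails)
print(round(heads_share, 2))  # close to 0.5: the coin has no memory
```

The share of heads after a nine-tail streak hovers around one half, exactly as for any other flip.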
Survivorship bias
This logical trap was discovered during the Second World War, but it is easy to fall into in peacetime too. During the war, the US military command set out to reduce losses among bombers and ordered an analysis of combat results to determine which parts of the aircraft needed additional armor. Studying the returning planes, they found many holes in the wings and tail, and decided to reinforce those areas.
At first glance this seemed perfectly logical, but fortunately the observant statistician Abraham Wald came to the military's aid. He pointed out that they had almost made a mistake: the holes in the returning planes actually carried information about their strong points, not their weak ones. Aircraft "wounded" elsewhere, in the engine or the fuel tank, simply never returned from the battlefield.
This principle of the "wounded survivors" is worth remembering today, whenever we are about to draw conclusions from asymmetric information about two groups.
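The bomber story can be sketched as a small simulation. The damage zones and the survival rule below are invented for illustration: hits are spread evenly across the aircraft, but a hit to the engine or fuel tank is always fatal.

```python
import random

random.seed(1)
zones = ["wings", "tail", "engine", "fuel tank"]
hits_on_survivors = {z: 0 for z in zones}

for _ in range(100_000):
    hit = random.choice(zones)           # each sortie: one hit in a random zone
    survives = hit in ("wings", "tail")  # engine/fuel-tank hits down the plane
    if survives:
        hits_on_survivors[hit] += 1

# Damage is uniform across all planes, but among the *returning* planes
# we only ever observe wing and tail holes.
print(hits_on_survivors)
```

Looking only at survivors, the engine and fuel tank appear untouched, which is precisely the opposite of where the armor was needed.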
The illusion of transparency
You find yourself in a situation where you have to lie. But how hard it is: you feel you are being seen through, that any involuntary movement will betray your insincerity. Sound familiar? This "illusion of transparency" is the tendency to overestimate other people's ability to understand our true motives and feelings.
In 1998, psychologists ran an experiment with students at Cornell University. Individual students read questions from cards and answered them, telling the truth or lying depending on the instructions on the card. The audience was asked to spot when the speakers were lying, and the speakers were asked to estimate their chances of fooling the others. Half of the liars expected to be caught; in fact, only a quarter were exposed. The liars greatly overestimated their listeners' insight.
Why does this happen? Probably because we know too much about ourselves and therefore assume that our knowledge is obvious to an outside observer. The illusion of transparency also works in reverse: we overestimate our own ability to recognize other people's lies.
The Barnum Effect
A familiar situation: a person stumbles upon a horoscope. Of course he doesn't believe in such pseudoscience, but he decides to read it purely for fun. Strangely, though, the description for his sign matches his own ideas about himself remarkably well.
Such things happen even to skeptics: psychologists call this the "Barnum effect", after the 19th-century American showman and skilled manipulator Phineas Barnum. Most people tend to accept rather general and vague descriptions as accurate portraits of their own personality. And, of course, the more positive the description, the more matches it seems to contain. Astrologers and fortune tellers exploit this effect.
The self-fulfilling prophecy
Another cognitive bias that works in the soothsayers' favor. Its essence is that a false but convincing prophecy can prompt people, involuntarily, to take steps toward its fulfillment. In the end, a prediction that objectively had little chance of coming true suddenly turns out to be right.
The classic version of such a prophecy appears in Alexander Grin's novella "Scarlet Sails". The storyteller Egl predicts to little Assol that when she grows up, a prince will come for her on a ship with scarlet sails. Assol fervently believes the prediction and becomes known for it throughout the town. Then Captain Grey, having fallen in love with the girl, learns of the prophecy and decides to make Assol's dream come true. So Egl proves right in the end, although the happy ending was brought about by far from fairy-tale mechanisms.
The fundamental attribution error
We tend to explain other people's behavior by their personal qualities, and our own actions by objective circumstances, especially when mistakes are involved. For example, someone else is late because they are an unpunctual person, whereas our own lateness can always be blamed on a broken alarm clock or traffic jams.
And this applies not only to the excuses we offer out loud but to our internal view of the situation, and that view keeps us from taking responsibility for our actions. So anyone who wants to work on themselves should keep the fundamental attribution error in mind.
The moral credential effect
A journalist known for his liberal views is caught in a homophobic outburst, a priest takes a bribe, and a senator who campaigns for family values is photographed in a strip club. These seemingly improbable cases follow a sad pattern known as the "moral credential effect". When a person builds a solid reputation as a "righteous man", at some point he may fall under the illusion that he really is sinless, and that if he is that good, a little weakness will change nothing.
The availability cascade
The cognitive bias to which the world's ideologues owe their success: a collective belief in an idea becomes far more persuasive the more often the idea is repeated in public discourse. We often run into it in conversations with our grandmothers: many pensioners are confident that everything said often enough on television is true. The next generation will probably feel this effect through Facebook.
Source: vk.com/feed?w=wall-23611958_141802