Why are we sometimes over-confident about our chances of success and sometimes not confident enough?
Short permalink to this article: https://bit.ly/2V8F7qw
Louis Lévy-Garboua*, Muniza Askari and Marco Gazel
People often overestimate their chances of succeeding at a difficult task but underestimate them when the task is easy. Psychologists call this phenomenon the “hard-easy effect”. Is this behaviour due to a “cognitive bias” inherent in bounded rationality? Or does it reveal a temporary learning phase among rational individuals who do not know a priori their real capacity to succeed at a new task?
In this article, Lévy-Garboua, Askari and Gazel answer the question by offering a model of “intuitive rationality” that integrates the two and predicts several apparently irrational behaviours, such as the “hard-easy effect”. They also use rich experimental data that reveals these behaviours and confirms the predictions of the model.
The authors designed an experiment resembling a game of double or quits, which measures whether self-confidence grows faster or slower than understanding. Players perform a task – solving anagrams – whose difficulty increases over three levels. They first try to pass the first level and, if they succeed, are offered “double or quits”. Success at each level brings ever-greater winnings, but failure – increasingly likely – leaves them with lower gains than they would have secured by quitting earlier. The data collected show that the first, easy successes generate over-confidence: players fail to learn their real level of aptitude, too often do not stop in time, and thus suffer heavy losses. To illustrate, the players who chose to continue to a higher level can be divided into four categories: 47% are capable and well calibrated, 12% are well calibrated but not capable, 36% are over-confident, and 5% are under-confident. Their failure rates, however, differ sharply: only 52% among the capable and well calibrated and 57% among the under-confident, compared with 78% among the well calibrated but less capable and 91% among the over-confident.
The explanation of these findings lies in the idea that individuals, lacking an accurate idea of their chances of success at a new task, estimate them rationally – that is, following Bayes’ theorem, strictly according to the facts and signals they perceive. These signals, however, are often subjective and fragile and, given the doubt in which individuals are steeped, include the objections they raise against their own beliefs. Thus, an individual who is almost sure of succeeding at an easy (for her) task will lower her estimate of her chances of success after envisaging the possibility of failure; conversely, one who is almost sure of failing will become more optimistic after envisaging the possibility of success. This swing produces the “hard-easy effect” as well as other observable behaviours: limited discrimination (an inability to perceive moderate differences between two options), overestimation of the precision of one’s own judgment (another facet of over-confidence), under-reaction to signals (inertia), and systematic miscalculation of the effort required (the planning fallacy).
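This swing toward the middle can be sketched with a minimal Bayesian update. The snippet below is my own illustration, not the authors’ model: the likelihood values are arbitrary assumptions, chosen only to show that conditioning on an imagined contrary outcome (the “objection”) pulls an extreme belief back toward 50%.

```python
# Illustrative sketch (not the authors' model): a Bayesian agent entertains
# an "objection" to her own belief -- an imagined contrary outcome -- and
# updates on it with Bayes' rule.

def bayes_update(prior, p_signal_if_success, p_signal_if_failure):
    """Posterior P(success | signal) via Bayes' theorem."""
    num = prior * p_signal_if_success
    den = num + (1 - prior) * p_signal_if_failure
    return num / den

# Assumed likelihoods of the imagined contrary outcome (illustrative only):
# the objection is more likely under the state it points to.
P_DOUBT_IF_SUCCESS, P_DOUBT_IF_FAILURE = 0.4, 0.8

# Easy task: almost sure of success, she imagines failing -> belief drops.
easy = bayes_update(0.95, P_DOUBT_IF_SUCCESS, P_DOUBT_IF_FAILURE)

# Hard task: almost sure of failure, she imagines succeeding -> belief rises.
# (The objection now points toward success, so the likelihoods swap roles.)
hard = bayes_update(0.05, P_DOUBT_IF_FAILURE, P_DOUBT_IF_SUCCESS)

print(f"easy task: belief 0.95 -> {easy:.2f}")  # under-confidence when easy
print(f"hard task: belief 0.05 -> {hard:.2f}")  # over-confidence when hard
```

With these made-up numbers the near-certain-success belief falls and the near-certain-failure belief rises, reproducing the direction of the “hard-easy effect” described above.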
Original title of the article: Confidence Biases and Learning among Intuitive Bayesians
Published in: Theory and Decision (2018) 84, 453-482
Available at: https://www.researchgate.net/profile/Louis_Levy-Garboua
Photo credit: Prostock-studio (Shutterstock)
* PSE Member