Reason: the supreme authority in matters of opinion, belief, or conduct

Category: Kahneman

Introduction to a series on cognitive biases

One of the many great banana skins for humans, one that sits right under our noses yet that we emphatically fail to see, is how much we infer from the visual appearance of others. Founded on the work of Amos Tversky and Daniel Kahneman, one of the most fascinating areas of psychology to emerge over the last four decades has been that of the cognitive biases we apply in our mental processing: the shortcuts we take while we relentlessly assess the world around us. Never more than a step from disaster or profound success, our brains have always been under unquestionable pressure to produce solutions that aid us in our actions. From this premise it is important to understand, without any doubt, that the brain has always valued ‘any’ information in the absence of reliable information. The brain with something to go on outlives the brain that has taken to heart one of Socrates’ most frequently enunciated philosophies: that we can be certain of nothing.

These phenomena are our cognitive biases, ranging from a natural fear of unfamiliar things to the bandwagon effect, where we believe more readily that which is accepted by the people around us. In the no-rules, fight-to-the-death battleground that shaped us through our evolution, these flawed but instant cues served us for the better: we could shape solutions to novel problems quickly, if not always reliably.

Today, however, these biases mostly detract from the decision making we engage in. At best, the gut feelings or rash conclusions we act on might be correct; at worst, we are confident without due reason. Our higher conscious brain allows us the powers of contemplation necessary to gain critical distance and second-guess the mistakes our brains make when we don’t stop to consider. It would be no great feat to list for pages the evils committed on the certainty of flawed thinking. To continue on and list the bad business decisions made today alone on the back of these mental shortcuts would be a task impossible to complete: new cases would arise faster than they could be documented, I don’t doubt.

A scan over the Wikipedia page listing the reproducibly demonstrated biases we apply, which I include below, is, I hope, enough to stop any individual short in utter disbelief at the suddenly clear errors we all make, if they weren’t already visible to you as you considered irrational actions in yourself or in others.

To punctuate this point I will call on an example right from the top: the proven inferences we make about competence from the faces of others. A recent study by Alexander Todorov, Anesu N. Mandisodza, Amir Goren, and Crystal C. Hall at Princeton University showed that judgements of competence based on facial appearance alone could predict the outcomes of US elections better than chance (68.8% of races). The margin of victory also scaled linearly with such judgements. Can the decisions we make on a topic debated so heatedly by so many, to my thorough exhaustion, be predicted by utterly irrelevant information? Well, yes, they can…



Hindsight Bias

Hindsight bias is the phenomenon whereby, after the answer to a question has been given, we believe we knew it all along. Modern psychology tells us that we edit our memories in many different ways, and that this editing changes with age. A very accessible study by Daniel M. Bernstein, Edgar Erdfelder, Andrew N. Meltzoff, William Peria, and Geoffrey R. Loftus shines light on this hindsight heuristic with clearly understandable experiments, and also examines its changing character across a representative lifespan.