ReasonINC

Reason, as the supreme authority in matters of opinion, belief, or conduct

Category: Heuristics

Prisoner’s dilemma finally probed

Game theory has previously told us that, regrettably, the best strategy in the classic ‘prisoner’s dilemma’ problem is betrayal. (The dilemma offers two individuals the same choice: cooperate with each other or betray. If both cooperate, both receive an equal, light punishment; if you betray a cooperating partner, they are punished more severely and you less; but if you both betray, you are both punished more harshly than if you had both cooperated.) Mathematically, then, our ‘selfish and aggressive instincts’ maximize our chances of survival or victory, which is unsurprising given the exquisite rigour with which evolution sorts variation for superiority.
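
To make the payoff logic concrete, here is a minimal sketch in Python. The sentence lengths are my own illustrative assumptions (the post gives no specific numbers); any values with the same ordering tell the same story, namely that betrayal is the better reply to either choice your partner might make.

```python
# Illustrative prisoner's dilemma payoffs: years of punishment keyed by
# (my choice, partner's choice). The exact numbers are assumptions made for
# this sketch; any payoffs with the same ordering behave the same way.
YEARS = {
    ("cooperate", "cooperate"): 1,  # both stay silent: light sentence for each
    ("cooperate", "betray"):    5,  # I cooperate, partner betrays: I take the heaviest sentence
    ("betray",    "cooperate"): 0,  # I betray a cooperator: I walk free
    ("betray",    "betray"):    3,  # mutual betrayal: harsher than mutual cooperation
}

def best_reply(partner_choice):
    """Return the choice that minimises my punishment, given my partner's choice."""
    return min(("cooperate", "betray"), key=lambda me: YEARS[(me, partner_choice)])

# Betrayal minimises my sentence whatever my partner does (a dominant strategy),
# even though mutual cooperation (1 year each) beats mutual betrayal (3 years each).
for partner in ("cooperate", "betray"):
    print(f"If my partner chooses to {partner}, my best reply is to {best_reply(partner)}")
```

Run as written, the sketch reports that betrayal is the punishment-minimising reply in both cases, even though two cooperators would each serve less time than two betrayers; that tension is exactly the dilemma.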

For the dismayed rationalist wondering how humans can ever live sustainably and compassionately, there is solace in the fact that our evolution moved into a crucial second phase where group selection ruled. Humans lived in packs, and the most successful packs dominated resources. This positively selected for packs that functioned well. Knowing this, it is easy to begin to dissect human behaviour all over again. We conform religiously to the accepted ‘truths’ of the group, to group ideals and values. We have amongst us a brotherhood; in certain circumstances we will transcend survival instincts or solipsistic practice to protect shared interests. Groups with such individuals were far superior to ones without. It is worth noting, of course, that we will, without hesitation, send a competitor group up the river (rival businesses or sports teams, etc.).

With this in mind, these surprising results from the University of Hamburg should be a little less surprising. Testing the ‘prisoner’s dilemma’ concept on actual prisoners, the researchers found that there was more cooperation than a purely mathematical strategy would dictate. Fifty-six per cent of prisoners opted to cooperate, with the end result that roughly 30% of pairs achieved mutual cooperation.
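
(For the curious, the two figures are consistent if we assume, roughly, that the two members of a pair chose independently: 0.56 × 0.56 ≈ 0.31, so a 56% individual cooperation rate translates into both members cooperating in about 30% of pairs.)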

(http://www.sciencedirect.com/science/article/pii/S0167268113001522#)

A short essay on conspiracy theorists

This short essay, posted on reddit (link below), highlights quite well the flawed thinking of most conspiracy theorists, who jump to conclusions of foresight and group complicity on the part of the powers they accuse. Even though I don’t know nearly enough to verify this writer’s facts, their arguments are nonetheless clear.

There is a common failure of assessment that arises in many situations involving a combination of simplification and an overbearing assumption of causality. These two heuristics are very common in the misrepresentation of problems and are easy to understand in the context of which brain traits gave the most profound evolutionary advantages in our past. Quick responses often confer an advantage over more accurate understanding. That is, an organism that can assess a problem and produce a solution faster often gets the food, the mate, or the victory. Even if they are sometimes wrong, they are also sometimes successful, and this is commonly achieved through the simplification of problems in their perception. Our brains also have a bias that tries to establish causal relationships. When the cause of an observed event is not understood, some of the brain’s capacity is still occupied as it tries to resolve what it doesn’t yet understand, and therefore cannot rule out as a threat. In the pursuit of this end, brains that were more cavalier, and less critical, about establishing cause were able to move on to solve new problems or execute responses. The causes they established might be wrong, but they weren’t shackled by philosophical reflection on their intrinsic uncertainty; the ones that too crudely established cause in mortal situations died out, until later generations had found a balance, and so on.

These simple flaws are retained in our biology and now undermine our most impressive feats of abstract reasoning. They can only be overcome through humble self-reflection to understand the mistakes your brain is likely to make, such as simplification or the desire to establish cause. It takes discipline to constantly remind yourself that any feeling of certainty is extremely foolish, and that your feeling of understanding may be, and in fact is far more likely to be, based on quite significant simplifications and misrepresentations.

(http://www.reddit.com/r/NeutralPolitics/comments/1escrl/conspiracists_understand_the_primacy_of_ideas/ca3cihx)

“The general population doesn’t know what’s happening, and it doesn’t even know that it doesn’t know” – Noam Chomsky

Cognitive biases 002 – The overconfidence effect

Introduction (https://reasoninc.wordpress.com/2012/11/05/introduction-to-a-series-on-cognitive-biases/)

Perhaps one of the cognitive biases we can, or should, all relate to most (for the reflective individual) is that of unfounded confidence. Evident more times each day, each conversation, than we could possibly count, this bias manifests in front of our eyes, in others and in ourselves, moment after moment, assertion after assertion. We feel certainty in our beliefs to the extent that we allow ourselves to become emotional in their defence, and less rational still as we stand by views often unreceptive to new information or logic. How often do we make fools of ourselves as we lay down pragmatic thought to argue with those around us as if to prove our intelligence, our wisdom? What irony that practice carries. All of us have been shown the fallacy of our ways on enough occasions that it is to our sincere discredit that we will still stand by our opinions with such strength, and with such vanity, that to the keen observer far too many disputes are simply a primal clash of egos.

How frequently we will not only arrest our own efforts of analysis, of self-scrutiny, too early, but also wager our credibility on our certainty that our ‘opinions’ are correct. Clearly, forgetting that founding philosophy of wisdom, that we “know nothing except the fact of our ignorance,” is a practice we engage in interaction after interaction. The maturity and wisdom shown by those who have the strength of character to second-guess their views, particularly when those views are public, and to go on to admit they are wrong, is something we should all aspire to.

Luckily these days we are able to support these philosophies empirically. Socrates had no such data to go on when he stood alone in asserting the wisdoms that have been held in the highest regard by the greatest minds that followed him, to his unending credit. On the one hand his commitment to logic was exceptional, and on the other it was hugely brave considering that, then more than now, challenging the fear-alleviating beliefs of his brothers and sisters carried the risk of death or worse.

The overconfidence effect describes the difference between the certainty that we feel, to ourselves or openly to others, and the actual accuracy of our views. This bias, like many, will manifest differently from person to person and across varying circumstances. It is often discussed as one of the most dangerous biases prevalent in humans, given its likely role in financial bubbles, foolish lawsuits, wars and conflict.

There is one specific example that illustrates very elegantly how absurd our confidence can be. Participants answering quiz questions, and giving confidence ratings, were wrong 40% of the time that they reported feeling 99% certain of their answer. ‘Indecision is final’, as the old cliché goes; and probably the organisms that practiced it were left behind by those that acted on their first impressions, instantly, without contemplation. In the world of today, though, how foolish they seem.
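
As a rough illustration of how a calibration gap like that can be measured, here is a minimal sketch in Python using made-up quiz responses (the data are invented for illustration, not drawn from the study mentioned above); it groups answers by the confidence the respondent stated and compares that confidence with the fraction actually answered correctly.

```python
# Toy calibration check on hypothetical quiz data (the values are invented
# for illustration, not drawn from the study mentioned above).
from collections import defaultdict

# Each response: (stated confidence that the answer is right, whether it was right)
responses = [
    (0.99, True), (0.99, False), (0.99, True), (0.99, True), (0.99, False),
    (0.70, True), (0.70, False), (0.70, True),
    (0.50, False), (0.50, True),
]

by_confidence = defaultdict(list)
for confidence, correct in responses:
    by_confidence[confidence].append(correct)

# Overconfidence shows up wherever stated confidence exceeds measured accuracy.
for confidence in sorted(by_confidence, reverse=True):
    outcomes = by_confidence[confidence]
    accuracy = sum(outcomes) / len(outcomes)
    print(f"stated {confidence:.0%} certain -> actually correct {accuracy:.0%} of the time")
```

With this toy data the first line printed is the overconfident one: answers given with 99% stated certainty were right only 60% of the time, the same shape of gap the quiz example describes.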

(http://en.wikipedia.org/wiki/Overconfidence_effect)

Cognitive biases 001 – The illusion of control

Introduction (https://reasoninc.wordpress.com/2012/11/05/introduction-to-a-series-on-cognitive-biases/)

Following the ground-breaking work of Ellen Langer in 1975, we have come to understand the illusion of control as a cognitive bias that manifests as the perception that we have more control over some events than we actually do. One of my favourite examples of this bias looked at gamblers trying to throw the combined dice value they needed in a game of pure chance: when a person needed a high number they threw the dice harder, and softer when they needed a lower value. Other studies have shown individuals to believe they could improve their ability to predict a coin toss with practice (44% of participants believed this).

Like all cognitive biases, this phenomenon must have served a functional advantage, and one could speculate for hours as to the benefit here, with many of the suggestions likely to be correct. Perhaps the still-evolving human, living in an unforgiving, survival-of-the-fittest world, who believed they could control events that were beyond their influence, would persevere longer. Maybe these organisms chanced their way through danger they couldn’t control but would assert themselves at the very first instant they could, because they had never believed that they couldn’t make a difference. Conversely, however, in situations where we have a lot of control we have been shown to underestimate it, perhaps suggesting that theories attributing responsibility to heuristics in the mechanisms by which we link goals to outcomes are correct. The bias has been shown to strengthen in stressful and competitive situations, which should be taken into account in its interpretation (the implications for financial markets, I’m sure, speak for themselves).

(http://en.wikipedia.org/wiki/Illusion_of_control)

(http://www.francescagino.com/uploads/4/7/4/7/4747506/ginosharekmoore_obhdp_2011.pdf)

Introduction to a series on cognitive biases

One of the many great banana skins for humans that sits right under our noses, but that we emphatically fail to see, is how much we will infer from the visual appearance of others. Founded by the work of Amos Tversky and Daniel Kahneman, one of the most fascinating areas of psychology to emerge over the last four decades has been that of the cognitive biases we apply in our mental processing: the shortcuts we take while we relentlessly assess the world around us. Never more than a step from disaster or profound success, we have always faced an unquestionable pressure on our brains to produce solutions that aid us in our actions. From this premise it is important to understand, without any doubt, that the brain has always valued ‘any’ information in the absence of reliable information. The brain with something to go on outlives the brain that has appreciated the most frequently enunciated philosophy of Socrates: that we certainly can’t be certain of anything.

These phenomena are our cognitive biases, ranging from a natural fear of unfamiliar things to the bandwagon effect, where we believe more readily that which is accepted by the people around us. In the no-rules, fight-to-the-death battleground that has shaped us through our evolution, these flawed but instant cues served us for the better, as we could shape solutions to novel problems quickly, if not always reliably.

Today, however, these biases only detract from the decision-making we engage in. At best the gut feelings, or rash conclusions, that we act on might be correct; at worst we are confident without due reason. Our higher conscious brain allows us the powers of contemplation necessary to gain critical distance and second-guess the mistakes our brains make when we don’t stop to consider. It would be no great feat to list for pages the evils committed on the certainty of flawed thinking. To continue on and list the bad business decisions made today alone, based on the shortcuts our minds are taking, would be no achievement either, and a task impossible to document as quickly as new cases arise, I don’t doubt.

A scan over the Wikipedia page listing the reproducibly proven biases we apply, which I include below, is, I hope, enough to stop any individual short in utter disbelief at the suddenly clear errors we all make, if they weren’t already clearly visible to you as you considered irrational actions in yourself or in others.

To punctuate this point I will call on an example right from the top: the proven inferences we make of competence from the faces of others. A recent study (by Alexander Todorov, Anesu N. Mandisodza, Amir Goren, and Crystal C. Hall at Princeton University) showed that judgements of competence based on facial appearance alone could predict the outcome of US elections better than chance (68.8% of the time). The margin of victory also scaled linearly with such judgements. Can the decisions we make on a topic debated so heatedly by so many – to my thorough exhaustion – be predicted by utterly irrelevant information? Well, yes, they can…

(http://en.wikipedia.org/wiki/List_of_biases_in_judgment_and_decision_making)

(http://www18.homepage.villanova.edu/diego.fernandezduque/Teaching/SocialNeuroSeminar/faces/Todorov_Science2005.pdf)

Hindsight Bias

Hindsight bias is a phenomenon where we believe, after an answer to a question has been given, that we knew it all along. Modern psychology tells us that we edit our memories in many different ways, and this also changes with age. This very accessible study (by Daniel M. Bernstein, Edgar Erdfelder, Andrew N. Meltzoff, William Peria, and Geoffrey R. Loftus) shines light on this hindsight heuristic with clearly understandable experiments and also examines its changing character over a representative lifespan.