
Psychology Of Intelligence Analysis


Skeddan

Recommended Posts

Posted

Heuer's classic work 'The Psychology of Intelligence Analysis' is available online from the CIA website:

 

https://www.cia.gov/library/center-for-the-...ysis/index.html

 

Anyone who thinks they make rational, objective judgements would do well to read this. It is an eye-opener to some of the traps and pitfalls people naturally fall into when assessing evidence and making evaluations (including the 'vain brain' assumption that they are not prone to errors of judgement like other people).

 

Among the things covered are how we fall into mind-sets that are hard to break, especially once they become commonly held popular beliefs, and the cognitive biases that make it hard to see what is actually there (Emperor's new clothes syndrome).

 

The final chapter is particularly interesting for its recommended methodologies - see the discussion of 'Evaluating Hypotheses' and 'Selecting the Most Likely Hypothesis' ("The most likely hypothesis is usually the one with the least evidence against it, not the one with the most evidence for it.").
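To make the 'least evidence against' point concrete, here is a toy illustration of how one might score competing hypotheses against an evidence matrix. This is my own sketch in the spirit of Heuer's Analysis of Competing Hypotheses, not his actual procedure; the hypotheses, evidence marks and scoring rule below are all made up for illustration.

# Toy sketch (not Heuer's procedure): mark each piece of evidence as
# consistent (+), inconsistent (-) or neutral (0) with each hypothesis,
# then rank by the LEAST evidence against, not the most evidence for.
evidence_matrix = {
    "H1: deliberate deception":   ["+", "+", "-", "-", "0"],
    "H2: routine activity":       ["+", "0", "+", "+", "0"],
    "H3: preparation for attack": ["+", "+", "+", "-", "-"],
}

def inconsistency_score(marks):
    """Count the items of evidence that argue against a hypothesis."""
    return sum(1 for m in marks if m == "-")

ranked = sorted(evidence_matrix.items(), key=lambda kv: inconsistency_score(kv[1]))

for hypothesis, marks in ranked:
    print(f"{hypothesis}: {inconsistency_score(marks)} item(s) of inconsistent evidence")

The point of ranking only on inconsistent evidence is that a dramatic hypothesis which 'explains everything' but is contradicted by two items should rank below a duller hypothesis contradicted by none.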

 

The Foreword by Douglas MacEachin is also worth reading. It is a reminder that these epistemic tools were developed to inform policy making in an arena where the price of intelligence failures can be very high. It also reminds one that in the public and political arena, promoting reputations, being seen to be sober and sensible, and not wanting to be associated with notions outside the accepted norms of 'groupthink' are some of the many traps people fall into. That alone leads to scoffing at and ignoring anything that does not fit the mental model - the paradigm trap that is so easy to fall into.

 

This of course applies not just to the work of the security services but to a broad range of fields. Fixed mental models can easily dominate (e.g. taking it as a 'given' that Manx farming cannot survive without subsidies), and frequently such assumptions are made without any basis and remain unquestioned - merely because it is 'accepted' that this is so (and it may seem laborious and pointless to question such things).

 

This short, very readable work addresses many of the key issues which lead to failures, mistakes, errors and misjudgements, and does so in a very accessible way. For those who quickly jump to rubbishing things that don't fit their fixed mental models, it is well worth reading. For those who think they are rational, have an open mind, and aren't prone to falling into such traps, it is also well worth reading.

Posted

I'm reading The Black Swan by Nassim Nicholas Taleb at the moment - which seems to be saying very similar things, but more from a financial/economic angle. He castigates financial analysts and economists who claim to be able to construct an explanatory narrative of the past and then set boundaries for future events - i.e. predicting, say, economic growth or growth in DVD sales of between X% and Y% per year. He says they are basically just making up stories full of confirmation bias and narrative fallacies, and that the future is inherently unpredictable because of Black Swans - unpredictable random events with major consequences, e.g. 9/11, the collapse of Lehman Brothers, the fall of the Berlin Wall, or the success of the Harry Potter books and movies!

Posted

There is a tendency for humans to want to polarise things, e.g. right v wrong, Muslim v Christian, terrorist v non-terrorist, rich v poor, left v right, smoker v non-smoker, even 'nothing to fear, nothing to hide' (innocent v guilty). That has never been more so than in the last 8 years under Bush/Blair and the Neocons and their 'you're either with us or against us' attitude. IMO this has led to a lot of people 'seeing' what they want to see, including over-simplistic classifications and over-simplistic risk assessments - far easier to make when there are only two views to categorise - whilst simply choosing to ignore the fact that many such polarised views actually represent false dichotomies (the reality being more complex than just two views). An analysis of this pattern of thinking is well overdue, as it has not only impacted world events with the invasion of Iraq etc., it has also over this period crept into the world of finance and our economies, to our detriment. But I fear it is too late.

 

Human behaviour does demonstrate much predictability, e.g. people make money out of technical analysis of stocks/the stock market, and psychologists make money offering their services analysing patterns of human behaviour. Even politicians know that to get positive attention these days they only need to talk on a polarised topic, such as prisons, smoking or youth drinking - it suits politicians that we tend to have polarised views because it makes their lives far easier and more straightforward, and means they don't need to be so articulate or have to answer difficult questions.

 

The reality is that most people who make money live in the everyday land in between these polarised views. They are not very interested in politics unless they need to lobby on behalf of their business, take little notice of the media except perhaps the business section, and their altruism is more often shown by a donation to charity than by working for one. They know there are peaks and troughs, and when the troughs come they are usually far better prepared than most, sometimes even engineering troughs whilst leaving other 'lowly' people to face the dole queue. Troughs are predictable in many ways to the canny, because most times they have revolved around over-priced assets. There will always be additional troughs caused by the odd major event, but even these can to some degree be predicted or prepared for - if only in the sense of not putting all your eggs in one basket. Governments know all about peaks and troughs, and to pretend otherwise is naive.

 

To me, this report is only a long-winded way of saying 'time to think outside the box' and move away from the two-position-only argument. What this poor type of thinking does is tend to drive the industries and complexes behind it: the politicians, the security services, the security industry, the military and the media, and therefore us... and it is the fundamental basis of 'the politics of fear', whether that fear is about terrorism, health or improbable threats/events etc. Politicians and lobbyists in recent years have begun to milk, even encourage, this poor thinking, along with using our general apathy and inarticulacy against us in order to get their own way, often without seeking compromise and at the expense of the civil rights of others, including their own people. The trouble is that we now live in an effective dumbocracy which has been engineered over the past few decades, so one question I would have to ask is...

 

...'Given the political, industrial and media complexes we have allowed to develop, do we actually have the ability or power to change our ways of thinking anymore?'.

Posted
To me, this report is only a long-winded way of saying 'time to think outside the box' and move away from the two-position-only argument.

AT Read it - or at least the summary in the short Introduction. It is more than that - it is easy to say 'time to think outside the box'; it is much harder to actually identify the traps that people fall into and how these can be avoided.

 

A lot of people would agree that we need to think outside the box - but in reality this rarely happens. Many people think they are making perfectly reasonable, sound, sober judgements when often they might not be.

 

To take an example of 'analysis' - consider the arguments over the constitutional relationship with the UK (something that is uncertain, and which is fairly important in the conduct of Manx affairs). Intelligent, rational people who think they are objective and open-minded have blindly insisted on an 'accepted' view which is based on nothing but assumption and unsubstantiated say-so by not very good historians. All the contradictory evidence is brushed aside and ignored (maybe after some minimal effort to explain it away). Cognitive dissonance often then comes into play, resulting in irritation and annoyance at any challenge to the mental model. It can even go to the extent of 'we don't need to take account of contradictory evidence, as no doubt there will be some satisfactory explanation' - or, alternatively, an offhand dismissal of 'does not merit serious consideration' (thus ducking any difficulties entirely).

 

You can't think outside your own brain (other than by discussing with people who hold different views rather than engaging in groupthink). Even then there are limitations. Understanding the psychology of how we process information and the traps we fall into helps avoid the limitations of our heavily biased vain brains. Otherwise 'thinking outside the box' is just verbiage. IMO it's not even about thinking outside the box - it is about the psychological impediments to analytic epistemology and how to overcome them.

 

And, as a further example, you jumped to conclusions about what this work was about and chose to understand it through the lens of your own views as expressed in your post. It's almost as if you didn't have to read it - you'd already made your mind up. All decision-making processes are prone to such errors, but when important matters are involved, falling prey to this easily leads to serious failures. Decisions are then driven by the strength of opinion of whoever has the loudest voice. Going to war in Iraq - not an intelligence failure, but a failure of intelligence.

Posted
To me, this report is only a long-winded way of saying 'time to think outside the box' and move away from the two-position-only argument.

AT Read it - or at least the summary in the short Introduction. It is more than that - it is easy to say 'time to think outside the box'; it is much harder to actually identify the traps that people fall into and how these can be avoided.

My comments were more general, about how Joe Average tends to think and make decisions, based on the boundaries to thinking that have been imposed on him in recent decades with regard to politics and the media.

 

i.e. Joe Average will expect others to deal with the complexities of the analyses, then be all too willing to accept the political and media representation of it all and face a polarised decision, e.g. Saddam - good guy, bad guy; WMD - yes, no; Iran - nukes, no nukes; Iran - attack, not attack. All this regardless of what an analysis says, how it was undertaken, and how improbable/probable it is. The human tendency is to look for a yes/no, whilst the politicians and the media cherry-pick what they want us to hear to emphasise that yes/no.

 

So it is not just about the detail of undertaking the analysis; it is what is subsequently allowed to be done/undone with the analysis, and how it is presented, that needs a re-evaluation checklist too.

