The deductive paradox: finding what you look for

EMPAC is hugely supportive of practitioner research, for many reasons. Firstly, professional context is effectively ‘built-in’, meaning there is usually high relevance to real-world professional problems. Secondly, the motivation is usually to make a difference rather than simply to observe and theorise, meaning impact and application are likely to be stronger. Thirdly, it rubs off: practitioner researchers act as progressive change agents and enthuse others, creating snowball growth for research and innovation.

But. Much as we are big fans of practitioner research, there is an issue that needs confronting, as it might weaken the approach. All that is required is explicit awareness of the problem; once that is sorted out, and we keep an open mind, we’re back on track and stronger than ever.

What is this problem? The deduction – induction spectrum. In English? OK, so deduction is having an idea early on that offers a suggested interpretation or even explanation, a kind of hypothesis, which you may then test to prove or disprove. It’s as simple as arriving at the scene of a crime and thinking ‘this looks like the work of X’.

Induction is more of a suspension of judgement, where you maintain an open mind for longer, see what the circumstances offer up, and form your working conclusions based on what has emerged. The two approaches live on a kind of spectrum: there are extreme forms of each, but they can also get quite close to each other. Both involve interpretation of what you’re seeing; deduction just means you back your horse earlier on.

At one end of the spectrum – in extreme deductive thinking – you might have made your mind up before you even get to a scene, based on the type of incident you’ve been told it is. At the other – in more extreme inductive thinking – you may not make your mind up even when there is plenty of evidence suggesting an obvious explanation. In between is where a lot of us operate; and that’s about right, as the approaches should complement and balance each other.

Decision-making processes, for example to solve a problem, are also described as heuristic (the word comes from the Greek: to find or discover). Heuristics are used a lot in policing (although you may never have heard the word) as they involve taking a pragmatic approach to finding a fix for a problem, even where not all the information is available. Heuristics are sometimes referred to as ‘mental shortcuts’, and that might entail what we could call an educated guess – the rough sketch below illustrates the idea.
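For readers who like things concrete, here is a minimal sketch in Python of one well-known ‘mental shortcut’, a ‘take-the-best’ style heuristic. The cue names, their ordering and the burglary scenario are all hypothetical, invented purely for illustration; the point is simply that the heuristic commits to a working conclusion from the first usable cue, rather than waiting for complete information.

```python
# A sketch of a 'take-the-best' style heuristic: check cues one at a time,
# in order of assumed reliability, and decide on the first cue you actually
# have evidence for, rather than gathering and weighing everything first.
# All cue names and their ordering are hypothetical.

CUES = ["forced_entry", "property_missing", "witness_report"]  # most reliable first

def heuristic_decision(observations):
    """Return a working conclusion from the first usable cue.

    `observations` maps cue names to True, False, or None (None = not yet
    known). Tolerating the Nones is the 'shortcut': the decision does not
    wait for all the information to become available.
    """
    for cue in CUES:
        value = observations.get(cue)
        if value is not None:                 # first cue with actual evidence
            return "burglary" if value else "no burglary"
    return "undecided"                        # nothing usable: suspend judgement

# Only one cue is known, but the heuristic still commits to a call.
print(heuristic_decision({"forced_entry": None,
                          "property_missing": True,
                          "witness_report": None}))   # -> burglary
```

A fuller, more inductive approach would wait, gather the remaining cues and weigh them together – slower, but less likely to be led by the first thing seen.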

What’s this got to do with research? Simply put, in the process of interpretation, if we draw conclusions too early, we may influence our search for data. The more you look for something, the more you may well find it. It can end up as a form of cognitive bias, as discussed by Kahneman and Tversky (1972). The issue here concerns some of the negative baggage that can come along with heuristics – those mental shortcuts – as we often draw upon personal experience, and what we’ve done before, to make quick decisions.

The implication of all this is that we can end up with all sorts of bias, particularly if we aren’t alert and monitoring our thinking and judgement-making. Confirmation bias, for example, can be like a form of wishful thinking and, as psychologist Raymond Nickerson (1998) points out, is pretty often linked to early deductive conclusions – a form of ‘hypothesis in hand’. In practice it can mean having ‘favourites’ within the data. You can also find ‘bandwagon bias’, where you follow the crowd, and ‘conservatism bias’, where you lean towards interpretations you’ve made before. Then there’s anchoring bias, where you might lock onto the first data you find and struggle to revise your view in light of other emergent findings. That’s a little like the earlier example of being sent to a particular type of incident: the control room has told you what it is, so that is in your mind when you arrive.

Dunning-Kruger bias is a form of overconfidence, compounded by an insular and uncritical culture. Social psychologist Scott Plous (1993) regards this form of overconfidence bias as a type of cognitive malfunction that can result in catastrophic errors in decision making. And the command and control world may be particularly susceptible to this.

Action bias can trigger doing something sooner rather than later, even if what’s being done isn’t really thought through. Coupled with authority bias, there is a strong risk of getting onto a trajectory of action like a runaway train that everyone rides just because everyone else is on it too. Someone asking ‘why?’ would be like someone pulling the emergency stop cord.

So, there are lots of forms of potential bias. Being aware of them puts you on guard and decreases the chance of falling victim. Deduction, as a form of heuristic (a practical problem-solving process), is a highly valid approach. But, as we’ve discussed, there’s a balance to be struck, as we need a certain amount of induction (observing) before we start deducing (concluding). There can hardly be a fixed rule for how much induction is correct; it depends on the circumstances, and on the complexity and availability of the data. The management of that deduction – induction spectrum is pretty much down to you.

Arguably, heuristics are common in policing culture, and deduction is a prevalent approach. That fits quick decision- and conclusion-making, based on what appears to be the case and supported by experience of what’s been done before. Being more inductive, though, offers strengths of its own when it comes to looking beyond what immediately appears to be the case, revisiting alternatives rather than repeating previous approaches, and spotting new, emergent differences.

Simons (2010) provides an entertaining experiment on selective attention, reinforcing aspects of the bias danger we’ve discussed. Watch this short clip: https://www.youtube.com/watch?v=IGQmdoK_ZfY

The deductive paradox is that once you choose what to look for, the more you look, the more, hey presto, you find – a form of amplification (Wilkins, 1964; Cohen, 1972). Your attention gets fixed and you can easily make your mind up. Anything else might then be perceived as an irritating distraction from the ‘proper’ focus. The paradox is that you find more and more about less and less; and in seeing more and more, you may be seeing less and less.
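To make the amplification effect concrete, here is a minimal sketch in Python. The areas, rates and search effort are all invented for illustration; the point is that two areas with exactly the same underlying incident rate end up looking very different once a deductive hunch skews where the looking happens.

```python
import random

random.seed(1)

TRUE_RATE = 0.1                   # identical underlying incident rate in both areas
checks = {"A": 900, "B": 100}     # the deductive hunch: 'A looks like trouble'

# Each check independently turns up an incident with probability TRUE_RATE,
# so the recorded figures track search effort, not any real difference.
recorded = {
    area: sum(random.random() < TRUE_RATE for _ in range(n))
    for area, n in checks.items()
}

print(recorded)   # roughly {'A': 90, 'B': 10} -- A *appears* nine times worse
```

Feed those recorded figures back into the next round of allocation and the gap widens again: the looking produces the finding, and the finding justifies more looking.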
