System 1 and System 2 thinking is a concept explained by Daniel Kahneman in his book "Thinking, Fast and Slow": System 1 is "fast thinking" and System 2 is "slow thinking". The book summarises decades of research, much of it conducted with his colleague, the late Amos Tversky, to whom the book is dedicated; this work earned Kahneman the Nobel Memorial Prize in Economic Sciences in 2002.
Essentially, Kahneman tells a story of two types of thinking that we use in our daily lives:
System 1 (fast) thinking operates automatically, intuitively and with little or no effort or sense of voluntary control. It cannot be switched off and is responsible for the vast majority of the many thousands of decisions we make on a day-to-day basis.
System 2 (slow) thinking is the type of thinking we can, perhaps, more easily identify with; it is the reasoning and considered choice-making that requires attention and concentration. However, it is slow and effortful and reduces our ability to carry out other simple tasks (try walking whilst figuring out a complex maths problem to see this in action). Therefore, System 2 thinking tends to 'run in the background' until it is needed and called into action, usually when System 1 is not equipped for the situation or self-control is needed.
System 1 (fast) and System 2 (slow) thinking each have their upsides and downsides. Fast thinking keeps us functional by making the majority of decisions quickly and efficiently, but it necessarily does not put the effort into fully understanding a situation or question, and it is riddled with biases that shape the quick decisions it makes. Slow thinking requires time and effort and takes help from System 1 wherever possible, so even when making considered decisions with System 2 (which is responsible for doubting and unbelieving), our decisions will be affected by the biases of System 1, which is built to believe. A great many biases impact decision making, from confirmation bias (seeking out, and putting more emphasis on, information that supports your own preconceptions) to the anchoring effect (the tendency to fixate on the first, or a particular, piece of information offered) and the gambler's fallacy (the tendency to think that future events are influenced by past events when they are not linked). Much has been written on such biases and they are not covered in any further detail here, but it is an entertaining aside to read up on some cognitive biases and see whether you can spot them in your colleagues (doing so is itself an example of the bias blind spot: the tendency to see biases in others but not in yourself).
So, what does all of this mean, and how can it help us better understand the decisions made by stakeholders in healthcare? Decisions made in healthcare, like decisions made in all other aspects of work and life, draw on both System 1 and System 2 thinking; they are, after all, made by humans acting in roles such as physician, payer or patient. That said, decisions made at an organisational level may be less prone to influence, since processes tend to be in place to limit the impact of individual human biases (although where individual biases stop, group dynamics are likely to take over). When we think about decisions being made about formulary inclusion, treatment pathways or individual patient prescribing, we often assume they will be rational, considered decisions. There is no doubt that reviewing the clinical and economic data of a new drug requires a considerable amount of effortful System 2 thinking. However, we should be aware that System 1 will also be at play: recognising what intuitively feels true (and dismissing or deprioritising what doesn't), building associations and unconsciously applying biases as the data is reviewed.
We probably know, reasonably well, the important factors that different stakeholders take into consideration in their conscious decision making, but do we understand the System 1 biases that also have influence? How much of the conscious decision making is actually post-rationalisation to justify the decision that 'intuitively feels right'? When we aim to understand the System 1 associations and biases that impact decision making, we can better connect with the decision-making process through different communication techniques. For example, System 2 tends to be lazy, and System 1 loves a cohesive story that intuitively makes sense; we can use this knowledge to develop compelling narrative (storytelling) techniques that build a consistent story System 1 intuitively accepts as true and System 2 is comfortable endorsing. To do this, you first have to uncover the System 1 thinking intrinsically linked to the decision-making process of customers at all levels.