Communicating when risk is unclear

Coronavirus has thrust risk communication into the spotlight. But, when risk is unclear, how do you communicate without causing confusion?




The media coverage of the coronavirus outbreak has highlighted some interesting communication challenges when dealing with something that is highly technical, outside the realm of understanding for most people and potentially very dangerous. The language used and the way data is presented have a significant effect on understanding, and if we look at how the story has developed it’s easy to see why confusion can arise.

In January, the coronavirus was expected to have a ‘low fatality rate’ of 4%¹ and was likened to a ‘mild infection’. But a month later, despite the mortality rate falling to 1%, the virus could potentially result in 50 million deaths², a situation that sounds very different to a ‘mild infection’.

And regarding levels of ‘risk’, at the beginning of February, following an increase in the number of cases, the UK’s Chief Medical Officers (CMOs) raised the risk level to the UK from ‘low’ to ‘moderate’. However, at the same time they created something of a logical black hole by insisting that the risk to individuals had not changed and remained ‘low’.³

These examples show how confusion arises: numbers on their own are pretty meaningless, and misunderstandings occur especially when different types of data are used that do not compare like-for-like, e.g. numbers of cases versus percentages. Even comparing the same type of data, such as a fatality rate of 4% with one of 1%, is unhelpful without additional context such as sample sizes. In January there had been only 600 confirmed cases, whereas now there are 45,000 confirmed cases and over 1,000 deaths, and the virus is known to be more contagious than originally thought.

Language can also mean different things to different people – and with the CMOs, language can mean different things to the same people depending on the context. In their defence, however, by increasing the UK-wide risk level, the CMOs were enabling the government to escalate precautionary planning even though the material risk to UK citizens was unchanged.

While these examples were cherry-picked to highlight the point, they do show why additional information and context are vital when communicating risk, as they allow a more accurate interpretation of the data.

But even when no one is deliberately trying to bamboozle readers, the way data is presented can have a major impact on what is understood – even by experts who deal with numbers every day.

NUMBER BLINDNESS

In one famous study, psychologist Gerd Gigerenzer investigated how doctors communicate the results of breast cancer scans to their patients⁴. Gigerenzer gave two sets of doctors the same facts, presented in different ways, and asked both groups the same question: ‘What is the probability of a woman who has tested positive for breast cancer actually having cancer?’ The results were surprising or concerning, depending on your point of view, with around 80% of doctors in one group getting the answer correct and roughly 70% of the doctors in the other group getting it wrong. See how you get on:

FACTS

  • The probability that one woman in the age group has breast cancer is 0.8%

  • If a woman has breast cancer, the probability she will test positive is 90%

  • If a woman does not have breast cancer, the probability she will test positive is 7%

QUESTION

  • If a woman has a positive mammogram, what is the probability she has breast cancer?


The answer to the question is about 9%. And as you may have worked out, this was the information given to the group in which most of the doctors estimated incorrectly – the answers given ranged from 1% to 90%.

The group who were able to answer the question more accurately were presented with the information in the following way:

FACTS

  • 8 out of every 1000 women have breast cancer.

  • Of these 8 women with breast cancer, 7 will have a positive mammogram.

  • Of the remaining 992 women who don’t have breast cancer, 70 will still have a positive mammogram.

QUESTION

  • If a woman has a positive mammogram, what is the probability she has breast cancer? 

Presenting the facts in this way, using ‘natural frequencies’, is easier to understand because the same group of 1000 women is being referred to at each stage. It then becomes much easier to see that 7 of the 77 women who test positive will have cancer – approximately 9%. There is a much fuller explanation of this study in Gigerenzer’s book ‘Calculated Risks: How to Know When Numbers Deceive You’, which has some other great examples of how numbers can mislead depending on how they are presented.
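For readers who want to check the arithmetic, here is a small sketch (in Python, with variable names of my own choosing) that computes the posterior probability both ways – from the percentage figures via Bayes’ theorem, and directly from the natural frequencies. The two come out slightly differently (roughly 9.4% versus 9.1%) only because the natural-frequency version rounds 7.2 women to 7 and 69.4 women to 70:

```python
# Percentage form (Bayes' theorem), using the figures from the first version:
p_cancer = 0.008            # prevalence: 0.8% of women in this age group
p_pos_given_cancer = 0.90   # 90% test positive if cancer is present
p_pos_given_healthy = 0.07  # 7% false-positive rate

# Total probability of a positive test, then the posterior
p_positive = (p_cancer * p_pos_given_cancer
              + (1 - p_cancer) * p_pos_given_healthy)
posterior = p_cancer * p_pos_given_cancer / p_positive
print(f"Bayes' theorem: {posterior:.1%}")       # about 9.4%

# Natural-frequency form, using the figures from the second version:
positive_with_cancer = 7     # of the 8 women with cancer
positive_without_cancer = 70 # of the 992 women without cancer
posterior_nf = positive_with_cancer / (positive_with_cancer
                                       + positive_without_cancer)
print(f"Natural frequencies: {posterior_nf:.1%}")  # about 9.1%
```

Either way, the answer is about 9% – far from the 90% that many of the doctors guessed.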

And to go back to my original question about the best way to communicate the risk regarding the coronavirus, the answer is: in a way that informs without scaremongering. The usual communication rules apply – build trust with your audiences by being transparent and open, especially when details are vague. And an effective way to build that trust is to present numerical data so that more people can correctly interpret it and understand the risk without panicking.

 

References

¹ New Scientist. (24 January 2020). What are the symptoms of the new coronavirus and how deadly is it?

² New Scientist. (11 February 2020). Could the new coronavirus really kill 50 million people worldwide?

³ Gov.uk. (30 January 2020). Statement from the four UK Chief Medical Officers on novel coronavirus. https://www.gov.uk/government/news/statement-from-the-four-uk-chief-medical-officers-on-novel-coronavirus

⁴ Gigerenzer, G. (2003). Simple tools for understanding risks: from innumeracy to insight. BMJ, 327(7417), pp.741–744.

 

Niklas Pettifor is owner of Pettifor & Pettifor, specialists in corporate communications, issues and crisis management. Get in touch: niklas@pettifor.com  www.pettifor.com