Interview with Sir David Spiegelhalter, Winton Professor of the Public Understanding of Risk at the University of Cambridge
Zan Boag: You said that one regret in your life was that you hadn’t taken enough risks, that you’d been too cautious in your career and in your travels. You say, “I wish I’d done more adventurous things.” What part do you think your job played in being cautious and not taking more risks? Or is this simply a regret that most of us are likely to have as we get older?
Sir David Spiegelhalter: I don’t think my job has hugely influenced my attitude to risk, it’s much more that my personality has led me to my particular job, and it’s also led me to my attitude to life. So they’re correlated but only because I think they’ve got a common cause. Correlation is not causation as we all well know. And the common cause is me.
Do you think there’s no possibility for change? We’re stuck with ourselves, are we?
No, no, not completely. And, on second thoughts, I think maybe my work with risk has to some extent influenced how I approach my life, as I do try to get a feeling of magnitudes, particularly when faced with a situation which asks for ‘thinking slow’. But in the end, how we feel about risk and threats is, I think, inevitably influenced by basic attitudes to caution and boldness. And I’ve tried to encourage my boldness by swallowing hard and going for it, jumping off high ledges into water and so on, sometimes walking on stage in front of a thousand people. You just have to do it even though your body is in a sense telling you not to.
But the more important thing is not, I think, your attitude when you are directly faced with a risk, it’s whether you put yourself in that situation to start with, and whether you open yourself up to the opportunity of experience. And I think as I’ve got older, I realise I’ve not always done that in a way that I wish I had.
We’re going to come back to a couple of different ideas about uncertainty, but I’d like to start with a definition of what you think uncertainty means.
This is tricky because risk and uncertainty can mean almost anything. For me, uncertainty is absolutely any situation that isn’t completely specified and known and determined, which essentially means every aspect of life.
That’s uncertainty; what about risk?
I don’t like defining risk because it can be used in so many different ways, and who am I to say what it means? For me it’s when you’re faced with a situation where things may go well and may go badly, and so you’re taking a risk. So it’s the way that unavoidable uncertainty in the future plays out.
As you said earlier, everything is uncertain when it comes to the future. We think we have a high degree of control over what’s going to happen…
And it’s not just the future of course, we’re hugely uncertain about what’s going on at the moment, what’s gone on in the past, and about how the world works. In fact, I’m less interested in uncertainty about the future than I am in uncertainty about the state of the world as it is and as it has been.
Yes, indeed. You mentioned in an interview that people overestimate the amount of control that they have in their lives. Do you think this is something that’s quite common for all of us?
Well, I’m not a psychologist but from my colleagues and the reading I’ve done, and just my personal experience, I feel people do overestimate their ability to control what happens. They search for meaning, they search for causes, they search for understanding about why things happen. I do think that it’s actually quite difficult to grasp the fact that much of the time, maybe most of the time, we can’t understand why things occur. They just happen.
That’s certainly something that’s come to the fore over the last few years. During this time, you and Anthony Masters drew on data to write a weekly Observer column that invariably centred on COVID statistics. What sort of response did you get from the general public about your columns?
Oh, enormously positive. And not just our columns, but throughout the two years of the pandemic I and other statisticians have seen it as our job to try to explain things to people. Not to say what should be done, not to say what policy is right or wrong, not to blame anybody for what they did, and not to predict what’s going to happen. By avoiding all those things which so many other commentators were obsessed with, there was a very strong role just for trying to understand and explain what’s going on. And this was deeply appreciated by audiences.
The media took some time to catch up on it. When I started doing media work on the pandemic it was just utterly predictable that I’d be asked, “What should we do? Who’s to blame? What’s going to happen?” And I’d say, “I’m not going to answer any of those questions, it’s not my job.” But I did try to explain what was going on, about the spread of the virus, positive tests, COVID deaths, trends in the data, and so on. I refused to engage in debates, speculate, or blame, and yet statisticians were very popular indeed. For me that’s a very positive lesson, and I hope it is for the media too.
It was quite a novel idea because what tended to happen in the past was that statisticians would present this information, and then the information would be interpreted, or perhaps I should say misinterpreted, by the media, by politicians, and by interest groups.
That’s why my book is called The Art of Statistics. It’s all to do with what you can learn from data and its appropriate interpretation. Just getting the stuff and showing it to people is enormously important and very challenging in itself, but it’s only part of the whole cycle. The people who are closest to the data know most about it and its limitations, so I feel they have a strong responsibility to help people judge what they can and can’t learn from that data, and not to leave it to others who may have their own agendas. And so, again, I and others have spent a lot of the last two years countering the misinterpretation of data by so many people driven by their own motivated reasoning. They have got a view of how society should work, and they choose and interpret data to support their argument. And that goes on not just in politics and the media but among a number of scientists as well, which I think is deeply unfortunate.
I’ll come back to your book because I’d like to ask you about that as well, but just staying on statistics and the way they can be manipulated and misinterpreted – they haven’t had a great reputation. Perhaps Mark Twain can be blamed for popularising the phrase, “Lies, damn lies, and statistics,” but I think it is really because politicians and companies will often use statistics to obfuscate. As a result, many people distrust statistics and the information they receive – fake news has been selected as the word or term of the year by several dictionaries. So, how do people cut through the spin?
This is the most important issue, which I now dedicate most of my time to. Communication is a two-way process – there are two parties to this: the people doing the communicating and the people receiving – the audience. For the people doing the communication, there’s an enormous responsibility to use what we would call ‘trustworthy communication’. Those principles are well established – that you try to inform people rather than persuade them, and show a balance of evidence, both positive and negative. But not a false balance – not everything is equal. You are upfront about uncertainty and the quality of your underlying evidence, and you also try to pre-empt misinformation – you get in there and make clear what you can’t say on the basis of the evidence.
We’ve even got a prize that we’ve given for trustworthy communication. So we know it can be done, but it needs care and thought, and for people to call out when it’s not being done. And then to encourage audiences to take a sceptical view of the claims being made and presented to them. But only sceptical, not cynical, because we need the numbers.
It has been all about numbers.
It has been all about numbers, and I’ve got to be honest, the experience of statisticians has been good. So it shows that the “Lies, damn lies,” nonsense just isn’t true, in that people do not have a totally cynical view of statistics. They want them and they appreciate them.
I’m not sure of the best way to improve ‘data literacy’ in the population. I think a critical approach to data can be taught in schools, but in fact mostly this is just common sense. In particular, there are very general questions to ask: “How does this number make me feel? What’s my emotional response to it? Why am I hearing this? What is someone trying to do to me by presenting it?” And to realise that many times when numbers are used, they’re being used to try to persuade you to change your emotional response – this is where the “Lies, damn lies,” thing has got some truth in it. In my talks on this I show that I can make any number look large or small, frightening or reassuring, just by changing the framing and context. I know other people can do it and so I just want to show people how to spot the tricks.
It’s interesting, you talk about our emotional response to things – most people wouldn’t expect to hear a statistician talking about an emotional response to numbers.
Oh, absolutely. Numbers carry a lot of emotional impact.
You say that we can never take an objective view about evidence, that we always bring our personalities into it.
Yes, we should recognise we may have feelings about how the evidence has turned out, and then actively try not to express them in our communication. Except to express enthusiasm for the knowledge gained, and for people engaging with it.
And you see yourself as an optimistic person. In what way does being optimistic or pessimistic affect our decision making or our interpretation of data?
If I quote a number, and it’s got some range of uncertainty, then you might focus on the top or the bottom of the range. Alternatively, if, say, I am discussing climate change, I could say, “The temperature change could be as high as three degrees,” or say, “It’s unlikely to be greater than three degrees.” Okay, exactly the same number, exactly the same scientific basis, but the story told in a completely different way. We have got used to thinking of these two processes: a cooler, rational, slightly mechanistic, quantitative way of analysis, in contrast to a warmer, emotional, spontaneous response. For me, they’re talking to each other the whole time – I don’t just click from one to the other. As Daniel Kahneman and others would say, at times of challenge, of importance, when we’re really all faced with something that’s quite difficult, we should try to think slowly. Just try to think slowly.
Well, that’s the thing, in your recently published book you didn’t call it The Science of Statistics, you called it The Art of Statistics.
It’s a different approach. Statistics is usually viewed as a science – that these figures simply are what they are. What kind of message are you trying to get across by calling your book The Art of Statistics?
I started with a quote from Nate Silver in The Signal and the Noise where he says, “The numbers do not speak for themselves, we imbue them with meaning.” Statistical analysis is not some algorithmic process where you turn a handle and out comes an answer, and that’s it. Obviously, there are some algorithms, you apply formulae, you do analyses, do all that all the time. You code and out comes an answer. But that’s not the real answer, that’s just the numbers that have been calculated based on what you’ve done, and those numbers cannot tell you what actually is happening. We have to use judgment and skill, context and experience, and understanding in order to interpret the stuff on the bottom of the screen. In my old days, at the bottom of the printout, but now it never sees paper.
COVID has demonstrated this enormously – you can ask, for example, a very simple thing, “How many COVID deaths?” And I can go on to a website and count COVID deaths. Well, what does that mean? It means different things in different countries. Is it people who died because of COVID? With COVID? Do we even know? So it’s completely deluded to think these numbers are the end of it. Crucially, we have to know where they came from, and the context, in order to judge what we can learn. Data science is the whole process of learning from data, and not just some technical algorithmic process.
You say that we imbue numbers with meaning. Now I know in the 1980s you developed ways for AI to handle uncertainty – fast forward 35 years and AI is becoming more of a part of our lives. Understanding that AI is developed by humans, do you foresee any issues with AI and data analysis over the coming years?
This is so important. To be clear, I’m not talking about ‘general’ AI, I’m just talking about algorithms that are applied to data and come out with some conclusion that then might be useful to people. But that’s the sense in which pretty well all so-called AI is being used. And there’s some wonderful stuff, even if it’s not always called AI.
But we know that if you’ve used a biased data set, the output will just reflect what you put in. Of course, that is now deeply appreciated, and has brought calls for ethical use of AI, transparency, explainability and so on. All of which, I think, is excellent, as it might stop people imbuing algorithms with some mystical abilities just because they are called AI. Some of the work is remarkable, but it always requires critical faculties to use appropriately.
You were an expert witness in the inquiry into the serial killer doctor Harold Shipman. I know this was some time back. You noted that Shipman could have been caught much earlier if someone had been looking at the data – lives could perhaps have been saved. Now, over the past few years data and death have been at the forefront of people’s minds because we’ve been grappling with the COVID pandemic right around the world. How has data been used to save lives, and is it possible to quantify how many?
The appropriate collection and analysis of data has saved countless lives. No medical treatment would be approved for use unless it was evaluated using statistical methods in randomised trials. Statins, vaccines, and so on have been evaluated using formal statistical techniques – regression analysis, P values, whatever. And statisticians have been at the very centre of all that work, and this has ended up saving vast numbers of lives. Not to mention the work determining the risks of smoking, run by statisticians and epidemiologists. So without that data collection, without that quantification, we’d still be in the age of medical anecdote, and it would be completely disastrous.
It’s only through rigorous data collection and statistical analysis that you can produce the evidence that’s convincing enough to make people like me take their statin every morning. I have no idea whether they will do me any good and I never will. However, if enough people take them, it saves huge numbers of lives. When I take my statin, I know it is lowering my cholesterol, but I have no idea, if I don’t get a heart attack, whether it was prevented by the statin. I just know, from the statistical analysis alone, that on average this benefits people like me. So statistics save lives.
Over the course of your life many different things will have happened to you – you have faced uncertainty throughout your life, just like anyone else. Experience tells us that life is uncertain by its very nature and that we don’t have much control over what the future holds – we discover this day in, day out – but people still seem to have an aversion to uncertainty. Is there a way of overcoming this fear of not knowing what comes next?
Oh, that is such a good question. I feel I personally have overcome it to some extent, in that I embrace uncertainty in my life. I don’t particularly seek to control what happens. Of course I don’t like it when things go wrong, and I moan and shout, but in the end it happens. Now I don’t know whether that’s a function of just getting older, or the fact that I’ve studied this stuff for years, or my basic personality, or the shocks I’ve had in my life, and the fact that I have managed to live with them. So I have absolutely no idea why I feel this way but I’m glad I do – I think that an increased acknowledgement of the lack of control in our lives would be, in general, quite good.
Just to finish up, you wanted to say something about different types of uncertainty…
I struggle to understand uncertainty and the ways to categorise it, although there have been many attempts to do so. But I first think of uncontrollable uncertainty about the future. It’s sometimes called aleatory uncertainty, or chance – when you can’t know what’s going to happen. And then next you’ve got uncertainty about some state of the world. For example, if I flip a coin and cover it up, what’s underneath my hand? That’s epistemic uncertainty – our lack of knowledge about a potentially verifiable fact, what we don’t know. But this is quite limited because it assumes you know what you don’t know, in Rumsfeld’s famous words.
So we need to go beyond that. And I think that Rumsfeld’s “Unknown unknowns,” is useful. What is outside your current ways of measuring and envisaging? It sounds a bit intellectual, but I like the phrase ‘ontological uncertainty’, meaning that the words and concepts you’re using are inadequate to describe what you’re trying to explore. This may mean possible futures you’ve never thought of, but also just the inadequacy of your measures. For example, we calculate statistics on wellbeing and ill health. Well, of course they’re not measuring whatever people mean by wellbeing and ill health, they’re just very limited metrics of it – there’s a huge amount left over. What’s left outside our conceptualisation and measurement, perhaps things we’ve never even thought of – that for me is the interesting area of uncertainty, because it acknowledges the limitations of everything we do with data. Which is suitably humbling.
Sir David Spiegelhalter has been Winton Professor of the Public Understanding of Risk at the University of Cambridge since October 2007. His background is in medical statistics, with an emphasis on Bayesian methods: his MRC team developed the BUGS software which has become the primary platform for applying modern Bayesian analysis using simulation technology. He has worked on clinical trials and drug safety, consulted and taught for a number of pharmaceutical companies, and also collaborates on developing methods for health technology assessment applicable to organisations such as NICE. His interest in performance monitoring led to his being asked to lead the statistical team in the Bristol Royal Infirmary Inquiry, and he also gave evidence to the Shipman Inquiry. In his post he leads a small team which attempts to improve the way in which the quantitative aspects of risk and uncertainty are discussed in society. He works closely with the Millennium Mathematics Project in trying to bring risk and uncertainty into education, gives many presentations to schools and others, advises organisations on risk communication, and is a regular newspaper columnist on current risk issues. Sir David is Chairman of The Winton Centre for Risk and Evidence Communication at the University of Cambridge, a Fellow of Churchill College Cambridge, and Associate Fellow of the Centre for Science and Policy. He was elected FRS in 2005 and awarded an OBE in 2006 for services to medical statistics. He received a knighthood in the Queen’s Birthday Honours List in 2014 for services to statistics.