
Confronting Uncertainty with Tamsin Edwards - Transcript

May 21, 2020

For leading climate scientist Dr Tamsin Edwards, probabilities and possible futures are part of her everyday work. She joins David to explore uncertainty in different aspects of her life: as a statistician and mathematical modeller, as a communicator about possible climate futures, but also as a cancer patient faced with a life-changing decision.

***

David: In this episode, I'm speaking with Dr Tamsin Edwards all about confronting uncertainty - as a climate scientist, as a communicator, but also as an individual in her personal life.

This is a more intimate episode than usual and we weave together some different threads. We cover how Tamsin has dealt with communicating uncertainty in climate science, and how she found listening to the perspective of the audience, including those deeply skeptical about climate change, very important in learning how to communicate. But we also explore Tamsin's personal experience after a medical diagnosis forced her to confront serious but uncertain risks to her health.

David: Tamsin, welcome to Risky Talk!

Tamsin: Hi, happy to be here!

David: Now you and I are both ‘uncertainty nerds’! People often think the science delivers cold hard facts that everyone agrees with, but what place does uncertainty have in your work?

Tamsin: I think the best way I've heard it put is that uncertainty is the engine of science. We couldn't do science without it. The idea that we are at the bounds of knowledge is both why we need to do the science and what's exciting about it.

"Uncertainty is the engine of science"

I got interested in uncertainty right from the start of when I went into climate modelling, actually through joining a project which had a statistician involved - which was perhaps relatively unusual at that time - through a mutual friend of ours called Jonty Rougier, and I just got kind of bitten by the bug, really. Of uncertainty, of exploring possibilities, of being open to not just thinking about the things that we didn't know, but trying to anticipate the things that might happen that we haven't yet - you know, the unknown unknowns and all of that - and the almost philosophical aspect of trying to predict the future when we have all that uncertainty in how something as complicated as the planet behaves, with a relatively limited amount of data to base it on, and there's always going to be limits to our knowledge. So yeah, I just really fell in love with the subject.

David: You specialise in developing models for climate systems, but your blog is called All Models Are Wrong. Okay! Why did you call it that?

Tamsin: Well, some people may know there's a famous quote 'all models are wrong, but some are useful' by the statistician George Box.

David: It’s got its own Wikipedia page!

"All models are wrong, but some are useful."

Tamsin: Which has its own Wikipedia page! So I stole it from that. And it came about, actually - I was always interested in that quote and I was already working on the idea of uncertainty, but it came about through some Twitter conversations about how one should communicate climate science and climate prediction to the public, I guess.

There was this kind of split view. On one side was the idea of trying to increase confidence in climate predictions by emphasizing all the things we were really certain about, the stuff we were really sure about - there's all this very hardcore, well-known, well-established physics at the core of these big climate models - and emphasizing all the ways in which we think very carefully about the models, and we test them, and we put lots of work into that. Versus the risks of doing that too comprehensively: saying we were sure to the point of absolute certainty, emphasizing scientific certainty, and the dangers that can lead to when the evidence base does change, when our understanding changes, when we have different predictions coming out in the media that might appear to be contradictory, when we understand and discover something new.

We then put ourselves at risk of people saying: but we thought you were sure about that, and now you've changed your mind!?

And actually, as well as being interested in uncertainty for my research, I'm a firm believer in communicating the idea of uncertainty in cutting-edge research to the public, to try and help resolve some of those difficulties around apparently contradictory and changing evidence - because evidence is apparently contradictory and changing precisely because that's the nature of uncertain knowledge!

David: In a previous podcast we did on climate change, one of our guests Anthony Leiserowitz said there's a double standard: that somehow people and politicians expect us to be so certain about things like climate change, and yet when you think about policies like Brexit or anything like that - actually there are huge uncertainties! And everyone seems to be able to accept those. What do you feel about that? 

Tamsin: Absolutely. I mean, obviously we're always making decisions under uncertainty, but I don't know how much I've seen people really saying 'you need to narrow your error bars on a prediction' - anything as explicit about uncertainty as that. It runs much more deeply.

I think it's much more about the fundamental tenets of what you're saying. It's not "oh, we'll wait until you can predict global mean temperature in the year 2100 to within 0.1 degrees Celsius, as opposed to, you know, 1.4 degrees Celsius". It's much more about the assumptions that go into that, and the testing, and the potential for groupthink and cognitive biases. So I don't normally see it as "reduce your uncertainty". It's much more fundamental than that.

David: So it's not a matter of the width of the error bars, it’s more people thinking, “Do you even know what you're talking about?”

Tamsin: Yeah. Do you even know what you're talking about, or are you lying with some agenda? Are you manipulating the results in some way? This kind of thing. The kinds of questions I get asked when I give a climate talk, from people who lean a bit skeptical, or that I get asked on Twitter, aren't "oh, well, your prediction isn't credible until you narrow your error bars". It's more: have you forgotten that this other thing might drive climate? Have you added some bias correction to the data that has made it systematically wrong? That kind of thing.

David: Yeah. So it's to do with systematic errors due to not including an important factor or just your model being wrong, which as you said, “all models are wrong!”

Tamsin: Exactly, right? And that's what I really wanted with the blog name and the blog feed: as well as being about uncertainty, to confront that head-on. So it was partly about how we come up with the error bars, because that's interesting and useful to talk about, but it was also about how we test those assumptions. How do we evaluate our models? How do we incorporate knowledge and check for systematic biases, groupthink, this kind of thing? And really say: look, I want to talk about uncertainty, I want to examine the thought processes, and I want to lay it all out in the open so that we can have a conversation - to make it transparent how these decisions are made, which things we're more confident about and have tested very thoroughly, and which things are more difficult because there isn't the data or we don't have such good understanding.

David: Okay. Can I ask you a more general question? What do you think at the moment about the way all the discussion about Covid-19 is going, about which there's quite a lot of, you know, basic disagreement about what should be done - and there is a highly skeptical movement as well. Do you think the uncertainty about the science is being communicated well or not?

Tamsin: I think it's been good that the UK government have been placing the chief scientific and medical officers in the public-facing briefings every day, and have been trying to include some of that quantitative data on the predictions, and trying to make it clear that the mortality rates don't include everything and have some time lag. So there have definitely been better efforts to make things clearer than I feared there could be, but at the same time there are a lot of things that I think are not very clear.

Lots of people seem to be very surprised about the likely timescale for these lockdowns - this idea that we have a review every three weeks, so people are getting their hopes up about the lockdown stopping every three weeks. But when I had a look at the original Imperial report by Neil Ferguson's team, it was very clear that the assumptions in those initial predictions that drove, as I understand it, the initial lockdown were several months for the first lockdown, followed by a lifting, followed by another lockdown, followed by a short lift, followed by another lockdown - and this would go in and out of lockdown based on the intensive care capacity for perhaps two years, with a majority of the time in lockdown!

So that seems to me a really key piece of information that hasn't been communicated. And it's not that that's a policy, but it's obviously part of the evidence base that has gone into the policy.

David: Exactly, and I do feel that the critique and discussion about all these things has not been that great, to be honest, and the journalists have almost seemed to be much more concerned with blaming people rather than actually examining what people are saying and critiquing the data and the assumptions being made, and so on.

Tamsin: Actually, there's a parallel with climate science there. For a start, the fact that predictions do get revised as we get more evidence and more data in - so that's not flip-flopping or some kind of weakness. That's a good thing. We should be changing our predictions as new data comes in. But also, and this is particularly relevant to climate science, the difference between a scenario of a particular policy intervention, versus an uncertainty range for a particular policy intervention or a particular model or a particular study. What I mean by that is that the original Imperial study had predictions for no policy at all, versus a sort of partial suppression, versus full lockdown, and those were very different numbers. But they would sometimes get communicated as "they can't make up their mind if it's this number or that number", or another study would come along with a different number and people would say "well, you know, these people can't make up their minds" - but they were comparing two different scenarios! So I think there's a subtlety around that idea of making predictions under particular scenarios.

David: I agree that there's a difficulty in communicating conditional predictions - as if that's your prediction of what's going to happen. No! That's the prediction were this to be the case. Within science, we're so used to talking about these conditional scenarios, these what-ifs, but when it turns into communication it becomes "that's the prediction". It happened recently with the Office for Budget Responsibility making that dire, you know, forecast - it wasn't a forecast, it was a scenario for the possible economic harms of the lockdown, and they explicitly said "this is not a prediction", and everybody said "that's what they predict is going to happen".

Tamsin: Absolutely. It is about partitioning the uncertainty, isn't it? In climate change the equivalent is the scenarios of different greenhouse gas emissions in the future. As climate scientists, we cannot predict what the future policies will be over the next century. So we make predictions for high greenhouse gas emissions, for medium and for low, conditional on those emissions - because we can't predict those. But what we can do is try to predict how the physical system of the planet, the atmosphere and the oceans, will respond to a given scenario. And again people will say "how can you predict what people will do or what emissions will be?", but it's a different thing.

David: So what we're really concerned with here is the communication. You're one of the lead authors of the next Intergovernmental Panel on Climate Change - IPCC - report, which will receive extraordinary attention.

What's the point of the report, and how well do you think they have communicated their work - particularly in terms of the uncertainties, the possible scenarios, and so on?

Tamsin: The IPCC assessment reports started in 1992, and they really gather the evidence base for the world's policymakers to think about climate change. Every five to seven or eight years we've had new assessment reports, and the next one is going to be the Sixth Assessment Report. These things are now enormous, dense volumes of information - and they don't even capture every part of climate science! They're really a sample of the key changes, and they try to explain and contextualize what we've learnt and what the impact of different policy interventions would be on climate change.

The idea is to be 'policy-relevant, not policy-prescriptive', as the phrase goes, so they present the predicted climate change under different greenhouse gas emissions.

There's been an interesting change in how we think about those predictions in recent years, because of the Paris Agreement setting things up in terms of a temperature target. We often used to think about greenhouse gas emissions, or concentrations particularly, as a limit - we've got to keep to 550 parts per million of CO2, something like that - and you may notice that we don't really think in those terms anymore. We think much more in terms of the probability of keeping to the two degrees of warming, or the one and a half degrees of warming, since the pre-industrial period. And of course that's a different way of thinking about it. You have to map out all the possible ways in which you could limit greenhouse gas emissions in a way that would then lead to only two degrees of warming or less - or, say, a two-thirds probability of that, not even a hundred percent probability! So instead of thinking in a forward direction - if we have high or low greenhouse gas emissions, what would the temperature be? - the new reports, like the Special Report on one and a half degrees of warming, were a lot more about "what would greenhouse gas emissions need to be to stay within this warming limit?".

David: Which I suppose does make it relevant as an outcome measure - the one that people are basically interested in.

Tamsin: Well, a very simplified metric. Obviously global mean temperature is an incredibly coarse way of thinking about climate change, but it's a useful ballpark metric of change, I guess. It's a summary of change that doesn't incorporate everything, but it's a scale on which to act.

David: So one of the things that we bring up in every one of these podcasts is: what's the purpose of this communication of the science, and of the uncertainty about the science? To put it really quite crudely - are you trying to inform people, or are you trying to persuade people of the importance of action on the threat we're facing? And it's been claimed that within climate science those two things have sometimes got rather muddied - which one you're trying to do. How do you see that balance?

Tamsin: It's a really good question. I've been involved with the IPCC just for a couple of years, and they are very much, as I say - almost their hallmark is 'policy-relevant, not policy-prescriptive'. They are not aiming to persuade. Now, you could argue that some of the communication around some of the recent special reports has been a bit "if we are to do X, we must do Y" - so that's still a kind of conditional framing. If we are to meet the Paris Agreement temperature limit of one and a half degrees, then we need to reduce greenhouse gas emissions quickly. If we are to preserve Arctic sea ice, or the natural land that we have, the forests around the world, we need to act quickly. So that's still conditional: there's an assumption that we want to keep to the Paris Agreement, or that we want to preserve Arctic sea ice or the world's forests. So I think that's a bit more straightforward, because it's still a kind of conditional persuasion, if you like.

And what I have been doing a bit more in recent years is trying to work out how to give scientific advice to Extinction Rebellion, whose communication goals are not the same as mine, I would say. My own particular goals as a climate scientist are to increase public understanding of climate science and the risks of climate change without advocating particular action. So I try quite carefully to preface things with "if" we're to keep to the Paris Agreement, then it implies this or requires this, and to make clear that those risks are important and real - and that's why people might want to take action on climate change.

For Extinction Rebellion, obviously, their aim is very much to get people out on the streets, to put pressure on politicians. They have particular goals, like achieving net zero in a particular year, for example, as one of their stated aims. And so to that end they will often emphasize the worst-case predictions, to give emphasis - they're not really interested in communicating the full range of climate predictions, or the full range of uncertainty, but in using the worst case to motivate action. And I'm not uncomfortable with that as long as they're clear about it, and so that's the advice I try to give them. I try to say, well, it's okay to say X as long as you give some context that this is really the highest end. Or - often the things that get passed around the activists are quite off-the-cuff, back-of-the-envelope, informal comments made by scientists, and sometimes not even scientists but related people - and again, that's not so bad as long as you make it clear that that's what it is, and that it hasn't come from a big peer-reviewed body of evidence from all of climate science.

David: But do you think they listen to you?

Tamsin: I would say it's mixed. Some definitely do, and some people less so. I think some people don't want to mess it up. They don't want to present science that's wrong, because it does damage trust in what they're doing, and it is a problem if they get called out on it - and I have done that, I have publicly said when I thought they were wrong, and that's not good for them, right? They want to avoid that situation. But on the other hand, there's always a subjective line in how clearly and transparently they are communicating that additional context.

David: You're very special, I think, because you have engaged both with the activists campaigning against climate change and with climate skeptics, in a way that I don't think I could have the patience to do. What's your approach? I think you've said that we all would like to know what to say when we meet a skeptic, but you think the better question is what you should ask a skeptic. What do you mean by that?

Tamsin: Yeah. Well, people always say "how do I persuade climate skeptics?" or "what should I say?", and the quickest way I can condense what I think into one sentence is: it's not what you should say, it's what you should ask. Where did you get that information from? Why do you think that? Why do you not trust the climate scientists? And you can learn a lot: was it some trusted friend? Was it some particular article they read? Was it one particular thing that didn't ring true for them that set them off down that path? And the other thing I would say is very much to make it a bit more nuanced than "I am the person with all of the truth and the good information - let me tell you all of it and dump it on you!"

David: All the letters after your name!

Tamsin: Yeah, exactly - that argument from authority. And rather than assuming that everything they say is wrong or has, you know this phrase, a 'bad faith' motivation, actually acknowledge that usually it's much more complex. Your typical climate skeptic or contrarian or whatever has a bunch of stuff that they are understanding correctly and agreeing with you on in climate science, and then there are other things that they don't. So it's really important to partition that, rather than just a blanket, sort of, "You're wrong, let me tell you how you're wrong."

David: You seem to exemplify something that we've said before on these podcasts, which we always repeat at the Winton Centre - that the first rule of communication is to shut up and listen. Listen to who your audience is. Where are they coming from? Why are they thinking what they're thinking?

Tamsin: Yeah. I'm not always successful - you know, on a bad day I'll be as grumpy as anyone - but yeah, I think that's it. And I've been trying to think about why that instinct comes, where it sort of came from, and the best conclusion I've come to is that my dad was a therapist.

David: So you follow the good-therapist cliché of saying 'tell me more'?

Tamsin: Yeah, well, sort of! And I don't mean that to sound patronizing, but it's much more about just thinking: well, everyone has a reason for acting the way that they do, and everyone has an internally consistent set of reasons for believing the things that they do, and it's about understanding the root of that and why it diverges from your way of thinking - while trying to hold in your mind the possibility that you might not have the correct way of acting and believing things. Now, as a professional climate scientist who has spent many years studying this, I'm not saying I always think that every climate skeptic could be as right as me. But it's more a mindset of saying: well, to this person, everything they've read and everything they've understood has entirely confirmed this particular set of beliefs about me and my science - rather than blundering in and saying "well, you're an idiot, what's your agenda?" Try to understand how it is that they can see you in the same way that you are seeing them. I guess that's what I'm trying to get at.

 

David: Okay. I'd like to make a change of topic, but I don't think it is such a big change, because you've discussed, really, how you bring your personality and your humanity into dealing with climate science, the uncertainty and the skeptics. And I hope you don't mind me asking about your experience with cancer. It may seem like a massive jump, but I don't think it is to you, and I don't think it is either - because we're dealing here with uncertainty, but on a more local, personal scale. So I'd like to ask how your scientific approach to uncertainty, to acknowledging what we don't know, has influenced your approach to the experiences you've had with your disease.

Tamsin: Oh, hugely. I was diagnosed with bowel cancer just over two years ago, after being ill with kind of serious IBS-type symptoms, bowel symptoms. I ended up with emergency surgery - half my colon taken out - and chemotherapy for six months during most of 2018. And I'm two years all clear now, and not having any treatment, so that's all good.

But there were so many interactions between my work and that cancer, and I don't know if it's the same for other scientists and statisticians who have had similar experiences. Statistics, risk and probability feel very different when you're at the sharp end of it than when you're talking about it in an abstract way.

And of course, that's the problem with climate change, isn't it? Climate change feels very distant. It's a long time in the future, often. The people worst affected are far away from us, in developing countries and so on. So when you're at the pointy end of probabilities, it does feel different, and I found loads of aspects of it interesting. I got surprisingly muddled about my own statistics, because of chemo brain and fear and the motivated reasoning of wanting to believe the good statistics and ignore the bad, and all of that stuff. I suffered from it. And I was interested to find out that I had personal risk thresholds that I thought were acceptable. For example, a 10% or less chance of recurrence of the cancer I felt was sort of manageable, but 50/50 was very much not manageable. And my own personal probabilities lay somewhere between the two, roughly halfway.

And I was also interested in the way that there was a trade-off in the risk of the treatment that I had to incorporate. There was the added chance of survival through chemotherapy, which was not quite as high as I thought it might be compared with the surgery - the surgery did most of the work; just chopping the tumour out did most of the work - and then the chemo is a bit of extra bonus on top of that. But of course the chemo is a very comprehensive poisoning, and has its own side effects, some of which can be permanent.

And the one they particularly focused on for my treatment was neuropathy, or nerve damage in your fingers and toes. It's a cumulative effect: the more sessions, the more cycles of chemotherapy you take of my kind, the more likely permanent nerve damage becomes. And it can be life-changing. It can mean you're unsteady on your feet, or you can't do up buttons - it can be quite difficult if you take it for a long time. So you're trading off those probabilities.

Now, I remember really noticing the change in how I felt about those probabilities. At the start my fear was only about my mortality, so I thought: "I'm going to take the full six months, no questions asked. I don't care what happens to my fingers and toes. I just want every little 0.1% of survival probability that I can get." By the time I got maybe four and a half to five months into the chemo, the damage to my fingers was becoming more apparent - and you have to slightly predict it; it's not real-time, it's a trajectory that they have to predict. As it starts to get worse and you feel the physical sensations, the very weird, alien changes in your body, you think: well, these are only going to get worse. Even if I stop the chemo now, they could be permanent, or they could certainly last for years. I found myself actually in tears in the oncologist's office, and he said, "Let's stop that particular treatment." He said, you know, you've taken most of it; it won't make very much difference to your survival, but it could make a big difference to your neuropathy. And I stopped it.

And so that was a change in trade-offs as I went through. But I've always tried to be quite open about all this stuff, and I think it's interesting to draw those parallels as well: lots of people see risks in climate policy - is it going to affect their quality of life? Is it going to damage the economy? So trying to draw those parallels between the long-term benefits versus the potential short-term difficulties of action was kind of helpful.

David: How - and I ask this both professionally and personally, because, did you know, I had prostate cancer and radiotherapy - but I have never had chemotherapy, and I think that's the one where I really would be starting to think about those trade-offs.

At our Winton Centre we put the front ends on risk calculators for women with breast cancer and men with prostate cancer, in which we give the probabilities of increasing your 10-year survival if you have chemotherapy. We've got numbers on that. We haven't yet got numbers on what the side effects might be, either short-term or long-term - they have to be described much more qualitatively, and people have to make their own trade-off judgements. In Cambridge, for example, they're quite clear that if there's less than a 3% increase in overall survival at 10 years they don't recommend chemotherapy, and at 5% they would recommend it; in between, it depends very much on the individual. So it's recognized that you should demand quite a lot before you would be willing to undergo that.

Tamsin: Yeah, there was a similar trial in bowel cancer - a big Europe-wide trial around those trade-offs - and they decided that for a lot of sub-stages of bowel cancer they would only give three months of chemotherapy, with a very low risk of permanent neuropathy. But I was unfortunately just in the higher-risk category, so they basically recommended the six months.

But I do think it seems - and I may be wrong - as if there isn't always the joined-up evidence available for that. For example, the other place I talked about these stories was on More or Less, and my main oncologist, who had presented me with all these statistics, was also on the show. He said it was actually quite unusual to see a patient so long after their treatment had finished and to be able to ask them "how is your neuropathy now?", six months or a year afterwards. So although that evidence may be gathered through other methods, to actually have the opportunity of asking "how do you feel about your neuropathy?" - they don't have the chance, because you're out the door as soon as you've had the treatment. They can't take an interest because they have to move on to the next patient.

David: You and I want these numbers. We would like to see these probabilities, even though, as you said, how we read them is hugely influenced by our current emotional state - both about the potential benefits and the potential side effects. Do you think this is something that should be routinely available to patients, even those who are not as obsessed with these ideas as we are?

Tamsin: Well, in my very first session with my oncologist here, he gave me the full PowerPoint that he gives to his medical students, because he realized quite what level of numbers I wanted. He did stop short of emailing me the PowerPoint - he said I couldn't have a copy - but obviously they have to make very quick judgements about what information people want and can cope with.

David: Yeah. I'm so pleased to talk to you about this, because after my experience I feel enormously motivated that everyone should have this kind of information available - in numbers as well, and presented in the best possible way. Maybe people just don't want to see it, don't want to know, but I feel your experience is extremely encouraging.

Tamsin: It's a very difficult and a very subtle trade-off, isn't it? Your short-term feelings are all about mortality, all about survival, and in a way your quality of life afterwards is secondary. But there's obviously a risk level at which it becomes more important. For example, if your chemotherapy was incredibly damaging and toxic and only made a small impact on your survival, and you were basically likely to survive anyway, you might think it wasn't worth it. But if you had a very low chance of survival, you'd just take it, no matter what.

David: There's an important point there: even if somebody doesn't want to see the numbers, by talking like that you can illustrate that there is a choice, there is a decision, there is a point at which you would flip. And so it should empower people to say, "Well, now I can choose - on which side of that threshold do I lie?"

Tamsin: And of course it's about expectation and support, isn't it? I mean, the NHS and the cancer charities - it's hard for them to keep supporting you for years and years afterwards, especially if you're not even undergoing any treatment, so it's really about managing expectations. Even if you would make the same decision either way, it's about knowing that the next few years could be quite severely affected in that way.

David: So, just finally - something I've always been interested in. As a statistician, I would like to put things in numbers, but so much of what we've been talking about today we're not able to put into numbers, because a lot of it is to do with a fundamental lack of knowledge, or our ability to predict just isn't good enough. How do you feel, then, about putting things into numbers versus giving a qualitative feel for how bad something might be?

Tamsin: Yeah, good question. I think hooking things on a few numbers can help. One of my great beliefs about science communication is that it's always possible to communicate something complicated. It's just hard. It just takes more time to think about how to construct it in a way that is both approachable and accurate, and has some kind of echo of, or pointing towards, the caveats or the simplifications that you're making. So it's not about numbers versus not-numbers. It's almost like a two-layer communication: giving those key points while not trying to give the impression that that's all there is to the story. It's saying, "Well, it's a bit like this", or "there are some other factors involved", or "here's a way you can think about it, broadly speaking" - those are the kinds of phrases you can use.

David: And you use the word ‘story’, and I think that is absolutely the right thing. We're talking about narratives that are engaging for people and that can combine those elements of something perhaps fairly hard and numerical and a much broader perspective.

Tamsin: I think there's been a great leap in understanding in the last few years, certainly in my field but in lots of fields, about the importance of storylines and narrative.

"Because it's not just about sort of helping people to understand, it's about exploring possibilities isn't it, imagining the future. It's about putting ourselves in the place of, this future or that future."

David: Tamsin Edwards, thank you so much indeed for this wonderful conversation, which I've enjoyed a lot and got a huge amount out of, and I hope people listening will have too. So again, Tamsin, thanks so much.

Tamsin: Thank you so much for having me.

***

Connect with Tamsin on Twitter and check out her blog