
Living With AI Podcast: Challenges of Living with Artificial Intelligence
The UKRI Trustworthy Autonomous Systems (TAS) Hub Website
This podcast digs into key issues that arise when building, operating, and using machines and apps that are powered by artificial intelligence. We look at industry, homes and cities. AI is increasingly being used to help optimise our lives, making software and machines faster, more precise, and generally easier to use. However, these systems also raise concerns when they fail, misuse our data, or are too complex for users to understand the implications. Set up by the UKRI Trustworthy Autonomous Systems Hub, this podcast brings in experts in the field from industry and academia to discuss Robots in Space, Driverless Cars, Autonomous Ships, Drones, Covid-19 Track & Trace and much more.
Season: 1, Episode: 2
Exploring Track & Trace and *that* App
This Pilot Episode was recorded on 12th October 2020
0:25 - Gopal Ramchurn
0:40 - Christine Evers
0:54 - Paurav Shukla
1:25 - Sean Riley
1:25 - Computerphile
2:12 - Joel Fischer
2:55 - Test & Trace NHS App
4:41 - Contact Tracing (Computerphile)
10:00 - Disappearing NHS App Messages (BBC)
11:15 - Excel 'scandal' (BBC)
11:30 - Test Results incompatibility
12:10 - NHS Covid-19 App downloaded over 10 million times (UK Gov)
15:20 - Apple Watch Health features (Which)
16:30 - UK abandons centralised Covid-19 App (Tech Crunch)
19:30 - Government report warned pandemic planning was confused (Independent)
22:40 - Contact Tracing (Computerphile)
24:40 - Trustworthy Autonomous Systems Hub
30:00 - Protect the NHS advertising campaign (ITV) April 2020
36:55 - It'll be over by Christmas (History Net)
42:20 - Volunteering may be good for body and mind (Harvard)
42:52 - Olympics - London 2012 (Wikipedia)
44:55 - Lack of access to contact tracing app could leave elderly more vulnerable to coronavirus (Telegraph)
Podcast Host: Sean Riley
Producer: Louise Male
Podcast production by boardie.com
If you want to get in touch with us here at the Living with AI Podcast, you can visit the TAS Hub website at www.tas.ac.uk where you can also find out more about the Trustworthy Autonomous Systems Hub Living With AI Podcast.
Episode Transcript:
Sean: Welcome to the Living with AI podcast, where we discuss the challenges of living with artificial intelligence: how will it impact society, personal freedoms and our general well-being? Today we're going to talk about track and trace and that app. On our panel today we have Gopal Ramchurn, a Professor of Artificial Intelligence at the University of Southampton and director of the newly set up UKRI Trustworthy Autonomous Systems Hub, the UK's flagship research programme on trustworthy artificial intelligence.
Christine Evers is a lecturer in computer science at the University of Southampton. Her research is focused on machine listening, equipping robots and autonomous agents with the ability to make sense of sounds. Her hobbies include road cycling. Paurav Shukla is a Professor of Marketing and Head of Digital and Data-Driven Marketing at the Southampton Business School. His research highlights the hidden meanings and associations embedded within consumption practices across cultures and offers novel insights for researchers and practitioners. Since lockdown began in March 2020, Paurav has taken up running and cycling as new hobbies and marvels at how AI has helped him run more than 100 kilometres and cycle more than 400 kilometres in a month. That makes me tired just thinking about it.
And if you're wondering who I am, my name is Sean Riley and I'm usually to be found waving a camera in the face of computer scientists on the Computerphile YouTube channel, but for this they've let me put the camera down and pick this microphone up. Now I know Gopal has to dash off, he's got to be somewhere, so let's hear from him first. Are we all ready for Lockdown 2, the sequel?
Gopal: Yeah, today is the 12th of October, right, so we're hearing about, we're going to hear about the new measures that will come into place, so looking forward to hearing about that and just to give the audience some background, so we're doing this as part of the activities of the Trustworthy Autonomous Systems Hub. It's a large programme funded by the UKRI and yeah, it's great to have you all on the panel today and thanks Sean for doing this.
Sean: No problem whatsoever. So I think the first thing we're going to do is talk to Joel Fischer about track and trace and perhaps he can give us a bit of background as to why we should care about these things. Joel is an Associate Professor in Human Computer Interaction at the University of Nottingham. We'll come back to the panel very shortly, but Joel, where do we start with this?
Joel: Yeah, sure. I think, you know, when we started out with the TAS Hub activities, as Gopal mentioned, we were really interested in finding an example of a public-facing autonomous system that we can engage people around and actually is something that they experience as part of their, potentially, as part of their day-to-day lives and, you know, we came to thinking about the test and trace system in particular, of course, due to the much delayed app, which we now have and is now available for everyone in the UK.
I think the interesting thing there is that, even though you might say, well, it's not an autonomous system like a self-driving car at all, it's not like that, you have an element of autonomy in the system in that there is kind of algorithmic decision-making at play. So, based on Bluetooth proximity sensing and these kinds of things, you have an algorithm that kind of crunches those numbers and potentially tells you to self-isolate. There's agency in that, you know, this thing can pop up and say, well, you have to stay at home for seven days or for, you know, whatever it is, 14 days. And as such, it has the potential to really touch everyone's lives in this country, and it's kind of an interesting case study for systems that have a kind of level of autonomy.
Sean: So, just to play devil's advocate, it's just following rules, though, right? You know, there'll be a proximity based upon, I don't know, signal strength or whatever that it's estimating and if you are very close to someone for a very long period of time and that person has said that they have been infected, then it stands to reason that it's going to suggest that you isolate?
Joel: Yes, exactly, but I think, you know, even though the example that you're giving here suggests we're talking about potentially quite a dumb kind of rule-based system, I think there's a little bit more at play than that. I don't want to get into, and I'm not an expert in, the details of the algorithm itself, but there is a bit more at play, you know, there's some quite sophisticated things going on with the kind of signal strength of the Bluetooth, giving you an indication of how far someone is away or whether there's a wall in between, for example, and things like that. So, there's more sophistication to the algorithm, I think.
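To make the kind of rule Joel is describing concrete, here is a minimal, illustrative sketch of an exposure check that weights Bluetooth signal attenuation (a rough proxy for distance and obstructions like walls) against contact duration. This is not the NHS app's actual algorithm, which is more involved; every threshold and weight below is invented purely for illustration.

```python
# Illustrative sketch only: a simplified exposure risk score of the kind
# described above. The real NHS Covid-19 app's scoring is more
# sophisticated; all thresholds and weights here are invented.

def risk_score(attenuation_db: float, duration_minutes: float) -> float:
    """Weight a Bluetooth contact by proximity and duration.

    Higher attenuation roughly means the other phone was further away,
    or that something like a wall sat in between.
    """
    if attenuation_db < 55:      # likely within a couple of metres
        proximity_weight = 1.0
    elif attenuation_db < 70:    # medium distance or obstructed
        proximity_weight = 0.5
    else:                        # probably too far away to matter
        proximity_weight = 0.0
    return proximity_weight * duration_minutes

# Flag the user if accumulated risk crosses a threshold, roughly
# analogous to "15 minutes within 2 metres".
contacts = [(52.0, 10.0), (65.0, 20.0)]   # (attenuation dB, minutes)
total_risk = sum(risk_score(a, d) for a, d in contacts)
print("Suggest self-isolation:", total_risk >= 15.0)
```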
But then I think as well what's important and actually that's true for a lot of autonomous systems, if you think about your smart speaker, that many of us have got a smart speaker at home now that you can just talk to, and it seems like another good example of potentially an autonomous system to an extent. But then actually the way these things work is that there's a huge amount of human labour going on behind the scenes, right, to make, to train these systems and add new training data and kind of teach new languages and dialects and accents so that these things actually work for the majority of us.
And they don't, and we know that often they don't work, but they work a lot better than they did when they first kind of came out and became more commercially available around five years ago. So, and with the test and trace system, you know, there is a huge amount of human labour going on. There's, of course, you know, there's the contact tracers that work behind the scenes and there's a huge amount of that happening as well.
And so, I think it's, again, it's interesting to think about these kinds of things as consisting not just of, you know, some sort of magic black box algorithm that just sort of magically learns and makes decisions, but actually there's a huge kind of ecosystem of different actors involved, of different kinds of people involved, including the general public.
Sean: It's interesting you say that because the original technology as put out by Google and Apple had this very simplistic sounding, well, if your Bluetooth is near some other Bluetooth, but I know that this particular NHS UK app has had the things like QR codes laid on the top of it and all sorts of other bits and pieces. So, I wonder how autonomous it is? What decisions is it making that we don't have any say in?
Joel: Yeah and, you know, I think, I mean, I think to some degree, I think that's a really good question. But if you've actually used the app, I mean, like me, you might have downloaded the app and you've had it on your phone now for a couple of weeks. And you might have had a notification, right, saying, you've been potentially exposed to Covid-19. And I had one of those notifications that says, the app is now validating the exposure. And then when you actually go into the app, there's no further information. So, there's no explanation as to where this notification is coming from, as to why you might have been exposed or when or where. And I realise a lot of that has to do with protection of other people's privacy. So, a lot of that is by design.
But what's missing is some kind of accountability, that the app, you know, gives you an explanation of its inner workings, of why it's come to this kind of decision, if you want, to give you this kind of warning. And that's sort of inconsequential in a way at this point, because it's just a kind of warning that you might have been exposed. But if you do get that notification that you should be self-isolating, you know, I think it's just normal and natural that people want an explanation and might want to be able to follow up with a human being about why that is the case, and how accurate and how confident the app is, and can be, in actually asking that of us.
So, I think there is a huge need for understanding people's views on, for example whether or not they want additional explanation and where they can get that additional explanation from.
Sean: I think it's human nature to start questioning who, when, why, what, right? So, I have downloaded the app. I've had several of these notifications over the last couple of weeks and you start wondering, is that that person I walked past in the corridor? Was it somebody I sat nearby when I had a coffee? And you just can't help but think about that. But when you think about what you want from the app, you want some kind of reassurance, right? And maybe it's just not been put together so well, in that it doesn't give you the reassurance of, hey, low risk, don't worry about it, and if it was high risk, we'd tell you. You don't get that from this.
[00:09:47]
Joel: Yeah, and it seems so, you know, it seems a simplistic kind of proposition that the app is sensing proximity and if you've been exposed, it's going to tell you to self-isolate. Great, right? That's a simplistic proposition. But yeah, it seems too good to be true. And the reality is, I think, that a lot of people are avoiding downloading the app because they don't want to get that notification. And there's, obviously, there's a huge amount of other things going on around this. You may not be able to work, you may lose your income.
It's important whether or not people are then supported or feel supported or whether they're going to be without income and unable to put food on the table. And I think all these things matter. So, there's a kind of, we understand these systems as socio-technical. So, it's not just about the technology and how well it works or whatever. It's, there's a huge amount of, you know, these things are embedded in people's lives and in society. So, it touches on all these societal issues.
Sean: We are putting our lives in the hands of this technology. But you mentioned earlier, obviously, there's a huge kind of machinery behind that technology, which is the track and trace system as a whole. The app is a part of a larger thing. And perhaps, I don't know if this is the forum for this, but we learned that some of that machinery is running on things like Excel and not proper databases. Again, perhaps autonomous would be better?
Joel: Yes, and before, I mean, you mentioned Excel, and the other thing was that this app is actually not compatible with some of the test results coming in from hospitals, right? That was kind of one of the first scandals that came out about the app: if you have a positive test result from where you've been tested in hospital, you might not be able to actually import it into the app. So, there are these kinds of mundane incompatibility issues, which I'm sure are being addressed and perhaps have already been addressed, that you see with this kind of technology playing out as well.
Sean: There's been a huge take-up of this NHS Covid-19 app in the UK. I haven't looked at the latest figures for downloads, but I know it was hundreds of thousands, if not millions, of people who have downloaded this app. Is this the first kind of mass adoption of this kind of automation in people's lives, do you think?
Joel: I think that's a good question, and I think the answer is probably no. I mean, if you talk about automation, there's a lot of automation we now take for granted in our everyday lives, for example, the washing machine. Most people have a washing machine. But when that came in as a new technology, it hugely changed people's lives. So, there tends to be this sort of, I suppose, pattern with new innovations, and automation in particular, that very quickly we actually adopt a lot of these technologies and they become mundane parts of our lives, and we take them for granted and don't think about them as actually being innovations in automation that assist us and help us in our everyday lives. But actually, our everyday lives are really full of mundane technologies that automate aspects of them.
Sean: Yeah, there's this idea, isn't there, whereby the word technology disappears when people get comfortable with something. I mean, at some point a pencil was technology, right? And now, you know, it's old hat in many ways.
Joel: Yes. I don't think anybody thinks about a washing machine when you say the word technology.
Sean: And that's quite important because I think, you know, washing machine we think of as, you know, this white good piece of kit that just does what we need it to do. And things like phones, albeit over the last few years it's accelerated, these are becoming a bit the same. People don't need to upgrade their phones because the phone that you got two years ago still does most of the things.
Joel: I think you're demonstrating it right now, Sean, because you're just calling it a phone. People used to call it a mobile phone or a smartphone. We don't need to say smartphone anymore because all our phones are now smart.
Sean: Yeah, it's a really important thing and it sort of becomes part of your working life. We use apps to help us improve our fitness. We use apps to help us find where we're travelling to. I have an Atlas in the boot of my car and I don't think I've looked at it for, I don't know how many years. We just become used to this bit of technology that just works, right?
Joel: Yes, and if you look at the contact tracing, the way it's actually integrated into the operating system of your mobile phone, then it's just like the step counter. It's just part of the health offering of what the operating system can do. So, it's getting there and it's becoming this mundane thing that is just something that mobile phones can do.
Sean: Is there an issue, though, and this is a loaded question as you'll appreciate, with us perhaps trusting these devices and these pieces of equipment with so much of our personal information? I mean, there's an advert running at the moment, a commercial on television for Apple Watches, talking about how they'll measure your blood oxygen and your heart rate and et cetera, et cetera. The health offering, as you mentioned, is becoming more and more important to people?
Joel: Absolutely. I think that's right. And I think, yeah, there are issues with privacy, and people are becoming more aware of them, I think, as well, which is a good thing. And I mean, a lot of effort has gone into thinking about privacy regarding these contact tracing apps. I'm not sure the issues have all been addressed, but certainly I think in people's opinions and people's consciousness, issues around personal data and the protection of personal data are becoming more pronounced.
Sean: I understand you don't know exactly how this particular test and trace app and system works, but just speculating briefly for a moment: I know that the underlying technology from the big companies, Google and Apple, is supposed to be private by design. But is it possible that apps like this are taking a huge amount of personal data, even if it's anonymised, and building a picture of a population, actually?
Joel: Yeah. And I think there was, if you follow some of the technical discourse that was going on, especially in this country, in the UK over the summer, an effort by the government to build an app with a different architecture, not a distributed architecture, which we now have, but a centralised architecture. And of course, with that, you have the issue that you are pooling personal information, like the contact tracing data, which is in and of itself highly sensitive because it gives insights about where people have been and how long for and so on. So, I mean, it's the perfect surveillance tool.
In the end, though, the model that the government envisioned, the centralised model, and the reasons why they envisioned this model, are good, because from an epidemiological standpoint, that difficult word that we've all had to learn in the last few months, we would benefit from this kind of centralised view of the contacts, and it can give you a much more precise picture of how the virus spreads. As opposed to this kind of decentralised model, where individuals get notifications and so on, but the idea is that no central government has got that data so that they can see and essentially track individuals.
So, I think you've got to weigh up the kind of pros and cons, and I think there are pros and cons to both of these models. And what we've ended up with now is the decentralised model. But it's not kind of a pure decentralised model either, because there is a part of this app ecosystem which is going to involve centralised data as well. So I think, for example, when you are reporting symptoms and requesting, and you've been told to request, a test and so on, you know, that's the point where you are then interacting via the app with NHS Test and Trace, which is a centralised system. And you're no longer in this kind of decentralised ecosystem of contact tracing, which, you know, works purely on the basis of the operating system on your phone.
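As a rough illustration of the decentralised model Joel describes: each phone broadcasts rotating anonymous identifiers derived from a daily key; someone who tests positive uploads only their daily keys; every other phone downloads those keys, re-derives the identifiers, and checks for matches locally, so nobody's contact history ever leaves their device. A minimal sketch under those assumptions; the simple HMAC-based derivation below is invented for illustration, whereas the real Google/Apple Exposure Notification protocol uses its own specific cryptographic construction.

```python
# Minimal sketch of decentralised contact matching of the kind described
# above. The real Google/Apple Exposure Notification protocol uses a
# specific cryptographic construction; the HMAC-based derivation here is
# invented purely for illustration.
import hashlib
import hmac
import os

def rolling_ids(daily_key: bytes, intervals: int = 144) -> set:
    """Derive the rotating anonymous identifiers a phone would broadcast
    over one day (e.g. one per 10-minute interval) from one daily key."""
    return {
        hmac.new(daily_key, i.to_bytes(4, "big"), hashlib.sha256).digest()[:16]
        for i in range(intervals)
    }

# Each phone stores only the anonymous identifiers it has heard nearby.
alice_daily_key = os.urandom(16)
heard_by_bob = set(list(rolling_ids(alice_daily_key))[:3])  # Bob was near Alice

# Alice tests positive and uploads nothing but her daily key.
published_keys = [alice_daily_key]

# Bob's phone downloads the published keys, re-derives the identifiers and
# matches locally; his contact history never leaves his device.
exposed = any(rolling_ids(key) & heard_by_bob for key in published_keys)
print("Exposure notification for Bob:", exposed)
```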
Sean: And of course, built on top of it, this QR code, etc., etc. But I mean, going back to that idea of a centralised system, that would be more powerful in terms of mapping the spread of any kind of infections, and also perhaps being more strategic in terms of making decisions, which, sometimes, without getting political, there have been criticisms of various governments for, haven't there?
Joel: Absolutely, exactly. And that's absolutely right. I mean, and that's why you've seen, I think that's why it's not been straightforward to make decisions on which kind of app architecture to go with, because there are benefits to having centralised systems, as well as shortcomings. And probably, though, as people and as individuals who are privacy conscious, the distributed system that we ended up with now is the more viable one.
[00:20:09]
Sean: I think that there's another side to this, which is just how much algorithmic decision making is happening. You know, we're talking about potentially semi-autonomous systems, would you say? I'm, you know, equally concerned about things like feedback loops: what happens if something goes wrong with those algorithms and they start to make the wrong decisions? And with a decentralised system especially, how are we going to know that's happening? Would that be fair to ask?
Joel: Yes, I think that is fair to ask. And I think a lot of people share the concern that you have; ultimately, you know, you might want to be able to get an explanation as to why things are happening with this algorithm that might tell you to self-isolate, and so on. And I think this comes back to this overarching interest, our research interest in autonomous systems: there has to be a balance between autonomy and control, if you want, or, you know, autonomy on the side of the human and autonomy on the side of the system. Because if you don't have that, then you are getting into these moments where people mistrust the technology and do not adopt the technology.
And especially with something like contact tracing, we know that it works best if a lot of people adopt it and a lot of people use it, and not if you have only got a minority of the population using it. I think from what I've read, some of the numbers suggest that the best uptake of these systems in some countries was around 40% of the population, and that was seen as a success. And if you think about it, what ministers have been saying is that they were hoping to get an uptake of 80%, and I think we're nowhere near that.
I don't know what it is currently in the UK. I've not actually seen any figures so far on what the uptake actually is. Yes, we've heard it's been downloaded millions of times, but that doesn't mean people are running the app, and it doesn't mean people haven't deleted the app. So we don't actually know exactly what people's views are and how successful this system is in the UK, and I think that's also part of the reason we're doing this research.
Sean: At the beginning of lockdown, some people listening may know, but I make the Computerphile videos on YouTube, and one of the things we did was a video about how this technology works. And there was an argument that said, if everybody doesn't adopt it, then it's not going to work at all. But I know the counter argument to that is you get a second order effect. If some people are using it, it's helpful. I mean, is that still the case? If some people are using it, it's still helpful?
Joel: Again, I think that's a good point. Yeah, there will be a benefit from some people using it. I mean, at the very least, you can, you know, describe your symptoms through the app and book a test through the app. So there are these kind of almost quite mundane things you can do interacting with the testing system. And if that means that more people might be getting a test, that's a positive thing. Now, that maybe doesn't necessarily help so much with the overall understanding of how the virus spreads. I think there are different things going on, I think, with test and trace.
And there are kind of, yeah, there are different agendas at play, if you want. One is to understand the national or international spread of the virus, and another one is to, you know, help people get testing locally. And these things, of course, are important for different reasons. And yeah, so I think even though not as many people as hoped might be using the app, there will still be a benefit overall.
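A simple back-of-the-envelope model makes Sean's "second order" point concrete: a contact can only be traced automatically when both people involved run the app, so if a fraction p of the population uses it, roughly p squared of contacts are covered. At 40% adoption that is still around 16% of contacts, which is far from nothing. A tiny sketch under that simplifying assumption (uniform mixing, phones always on and carried):

```python
# Back-of-the-envelope only: assumes uniform mixing and that a contact
# is traced automatically only when *both* parties run the app.
for adoption in (0.4, 0.6, 0.8):
    covered = adoption ** 2
    print(f"{adoption:.0%} adoption -> roughly {covered:.0%} of contacts covered")
```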
Sean: Finally, I'd like to ask, what can we learn from this particular, if you like deployment of technology for kind of other areas that might have autonomous systems kind of built in?
Joel: Yeah, I think that's a really good question and it goes to the heart of our research. I think what we will learn is really the complexity of how autonomous systems like this, which are public facing, are embedded in people's lives, and how they are so multifaceted that so many different aspects of people's lives are being affected by them. And as a result, we will hopefully be able to understand a bit better what matters when we design, or what we can say to designers and developers of autonomous systems to take into consideration: the kind of broader societal embedding of these technologies.
Sean: Great stuff. Thank you very much, Joel Fischer, for joining us and talking to us about track and trace and autonomous systems.
Joel: Thanks, Sean.
Sean: So that was Joel talking about track and trace and autonomous systems, and I'd like to welcome Joel to the panel now, joining Paurav and Christine, so we can discuss some of those things that popped up. Paurav, what do you think? You've used apps for improving your own health and fitness, as I understand it. What do you think about the health offering?
Paurav: Yes. When I was thinking about it, and when I was listening to Joel particularly, one of the things that came up, Sean, was this idea of trust, and I think that is central to everything. Funnily enough, when the lockdown began in March, you know, I was at home, suddenly my commute to work had stopped. And so I decided to use that commute to some other ends and, funnily, got into running and cycling and healthy pursuits, if I may say so.
And those healthy pursuits then led me to different kinds of apps. And they started telling me how much I was running and, being, you know, a data analytics minded person, I realised that there was so much data that was given to me. And anyway, it started kind of kicking in that competitive side of me, but I trusted it more with experience, so I then had two apps, comparing what each app was doing: if I ran 5km, did they actually show 5km or not, and so on and so forth.
But interestingly, that trust was built with experience, and I think with this NHS Test and Trace system, the problem lies with the experience, because people have never had any experience of anything like this in reality, when you think about it, with a health-related idea like this, and so suspicion is bound to occur. We are by nature sceptical of new technologies and this is a new technology. And that is what is affecting us, I think that is what is causing us to feel, you know, a little bit apprehensive about it. And so we have more questions than answers right now.
Sean: But some of that mistrust possibly comes from the, what's the best way of putting this, the origins of the app. I mean, we've been hearing stories of this failed project to make a previous version of the app, and then the new one. I mean, anecdotally, I can tell you about multiple people I've spoken to who've gone, “Well, I'm not going to download that, I don't want them knowing everything I'm doing.” Which, you know, sort of misses the big picture here. And also, from what we were just discussing, Joel and I, the whole technology is supposedly built on this Bluetooth thing that is anonymised and, in theory, it should be completely anonymous, right?
Paurav: Yeah, it is quite an interesting observation, you know, and like we talked about earlier, trust is a multifaceted, multi-layered phenomenon. And a lot of times we don't realise that. So, in a way, Google has also failed, and Apple has also failed, and every technology company has failed with their own technologies multiple times, but they've kept on at it. And I think that is what we are doing here also. But who is the owner of the trust matters a lot. In a way, do we feel that the person who is promoting this message, of whatever kind, or application, or for that matter, you know, product or service or anything, is that person or is that entity trustworthy?
That trustworthiness is such a fundamental characteristic. And it is so easy to lose, and it is so hard to gain. It's like toothpaste, you know, so easy to squeeze out, but once you have taken it out, it's so hard to put it back. And it's the same with what is happening here. So the first incarnation of that app, you know, touted to be world class and globally leading, actually did not deliver the way we thought it would deliver. And so suddenly the trustworthiness of that entity went down. And now as we are going into it again, we are being more sceptical about it. We are thinking, is it really right for me? And then newer and newer questions are emerging. It can only be answered over time. So time is the only healer here.
But at the same time, we also need to think about who is promoting this message. And I think that is where the key lies. Remember the first time when the lockdown occurred, when the message was protect the NHS, people rallied behind it because we trust in that system. That is the trustworthiness. That is possibly the pinnacle of our trustworthiness. And the NHS was driving the whole agenda. As the driving of that agenda went from the NHS towards the government, a different stakeholder in our mind, suddenly the trust element changes, the trustworthiness element changes. And then a number of things have happened.
I don't want to go into the political debate of it, but a number of things have happened wherein we have got contradictory messages. And these contradictory messages, again, reduce trustworthiness. And so that layering of trust, it cuts down again and again and again. And suddenly the same entity is trying to tell us, you know, trust us because we have done it before. We start feeling, oh my god.
Sean: Well, it's interesting, this trust thing that you're saying has built up through your exercise regime. Christine, I know that you're a cyclist as well. Have you been using these apps or not?
Christine: I have done, and similar to Paurav, over the lockdown I probably have become more active in cycling than I was before, albeit predominantly limited to indoor cycling, mainly because of the restrictions that we're under. And with that, actually, there was a huge uptake, at least from my side, of trying different apps that were available. Actually, talking about that, what I realised during that time is that these apps, even though the information may not be used, do have an incredible amount of personal information about my health, about my activity levels, possibly about my stress levels.
If you think about it, as a cyclist, you will probably have a heart rate monitor attached to yourself, possibly a power meter. With that information, from a healthcare perspective, this could be absolutely invaluable for actual diagnosis and early prevention of disease. But then on the flip side, given where that data is located, where it's stored, how can we open up channels to make that system and that data available to healthcare, to prevent that data from being misused by adversarial agents, but also, aligned with what Paurav said, to ensure that users trust that that information is kept safe and handled in a responsible manner?
So from that perspective, I think there's a lot of psychology involved as well in what people perceive as trustworthy, but also in how systems convey the technical details that are actually used in order to make sense of this vast amount of data that they have. And that is not an easy feat, because as an expert user, you might be able to actually understand this if you have an understanding of machine learning systems. Of course, you could look into the source code for things like test and trace, but that doesn't mean that the general broader public is able to make sense of it, and I think we have a responsibility as people who work in the sector to make sure that this information is actually conveyed to the public and the end users.
Sean: I think it's really interesting that you mentioned how much data these health apps, which you voluntarily installed and possibly even paid for, have got, have access to, and are storing about you. I mean, I'm the same, I'm an addict for looking at the charts and seeing if I was faster on that section or this section. But then when you align that with what we were just saying earlier about trust and the track and trace system, I wonder if it's those same people, who are paying to have an app tell them in infinitesimal detail how their heart rate was when they were powering up that hill, who are worried about somebody telling them they may have been exposed to a virus.
It's quite interesting to sort of make that juxtaposition that actually, I think what it comes down to is where the kind of like, not governing body, but where the authority comes from. Where did the app come from, right? Did you pay for it? Did you choose it? Or is somebody telling you to put it on your system? What do you think about that, Joel?
Joel: Yes, Sean, and I think that's, again, a really good point. And also, I think I complete the triad here, in that I also have been, you know, doing more running and particularly running, actually, over the lockdown and also been using these apps. And yeah, I think it has a lot to do with whether or not you choose to download the app and seeing a kind of a personal benefit in it. Whereas I suppose the overall kind of premise of the contact tracing is that it's an altruistic thing to do. It's not something I personally benefit from. In fact, I am more likely to lose out because the app might tell me to self-isolate. And therefore, I know I'm not able to go out for a week or two. And that's, of course, a high price to pay, really.
But the overall premise is that it will improve, you know, the situation for everyone in the country and not just me or me per se. And so, you know, it works at a certain time, I think, where people are compassionate and understanding about the situation for everyone and want to improve the situation for everyone. And it's not a kind of a, you know, it's not something that's an individualistic driver, I think, you know, because Christine was talking about psychology there and I agree.
I think perhaps we're all getting a bit tired and bored of the virus now and the lockdown and the restrictions. And so, it might come at a bit of an inopportune moment where we're kind of losing steam and we don't really feel like we should be self-isolating anymore because, you know, we've been doing this for months and months and months. And that's just perhaps, you know, one of the complicating factors that play into whether or not people use or are inclined to download these apps.
Sean: I hate to make the parallel to this, but there was a feeling in one of the kind of world wars that, oh, don't worry, it'll all be over by Christmas, right? And I think there is a parallel, this could, you know, this coronavirus situation could be with us for years and us adapting and working around it. And it's quite interesting, you mentioned kind of the difference in drivers between the different apps, right? Altruism, I think, is key there. When I download an app to check whether I'm running fast enough or for long enough or for whatever, I'm doing it for me, right? Whereas, like you say, aligned a little bit with mask wearing, when you download the Covid app, you're actually doing it for other people. Is it worth trying to change that? Or is that just, is that baked in? What do you think, Christine?
Christine: I think that's quite an interesting question to ask, actually, and quite a difficult question to answer. I think what I would be interested in seeing first, I mean, it was briefly mentioned earlier on, is that the UK was quite late to the table, actually, in terms of the uptake of the app itself. And I wonder what the factors were in that sort of delay in the response from people taking it up. And I mean, seen from a population point of view, I don't think the deciding factor in that was necessarily the altruistic aspect, but potentially just the mistrust, or scepticism, perhaps, is maybe a better word for it. And what were the factors that actually delayed the response? And is the delayed response now also a cause of a potentially reduced uptake?
Because as soon as, as Paurav said earlier on, as soon as there's a sort of limited trust, it's very difficult to build that up again and that might be potentially why we don't see as high numbers as we really should see, because it's in our own interests that we actually use the Test and Trace app to make sure that all of us and everyone around us is safe.
Sean: Paurav, you know, marketing is one of your areas. Was this just a communications error?
Paurav: I wouldn't say so, Sean. I think it's beyond, a little bit more than, communications. As Joel and Christine were mentioning about altruism, we are by nature not altruistic. The question we ask when we do any activity is, what's in it for me? There is a tinge of selfishness in all those activities we do. And in that regard, when you think about the public at large, what you start seeing is that when people cannot understand how their actions are helping others or themselves, they are not likely to engage in that activity.
[00:39:54]
So when you think about it, for example, when you or I or Christine or Joel have taken on those running apps and started looking into them, those apps are telling us, using our heart rate data and that power meter data, what we are doing. When it goes into this NHS app, it is almost a black hole right now. Joel was putting it very nicely, saying that I've received a notification, but what do I do with that notification now? I can't see anything. So there is no clarity.
In a way, it's not just about communication, but it is the actions associated with it. And I understand that there would be data privacy issues and so on and so forth. But the app does not tell me. When I click on that notification, the app should tell me, “Please wait, our people are working through it. There is a process. Let me show you a flowchart of how that process works. So possibly in a week's time, we will come back to you. We apologise, but it is a complex process.”
Now, if that clarity and communication was provided, it would not be just communication, but that clarity where people can understand what it is that you are doing with my data. For example, when I'm wearing my watch and I go running, when I come back, that watch tells me, you've run this much, you've run at this rate, your heart rate was this, you went up and down like this, and so on and so forth. So you are in this zone, you could do this better. Then I know what they are talking about.
But if they just told me, you could do better, then the next day I'm not going to wear that watch. What do you mean, you know, I could do better? So these are some key concepts right now. This is a public-facing system; we have to approach it in the way the public understands it.
Sean: So this is kind of an issue with positive feedback, or the lack thereof, right? I mean, I've got to be honest, I mentioned earlier that I'm a sucker for those charts, you know. If it had said, “Hey, you've been near 26 people this week, and two of them may be issues, but you know.” Then suddenly, I'm starting to think, okay, I can see that, that's maybe low risk, so I don't have to worry so much. It's perhaps just down to the implementation of the app itself, the way it's been put together.
Because, just to reiterate, when people talk about altruism, the vast majority of people who do, say, volunteering, or do something for a vocation rather than for money, always tend to, this is completely unscientific, but tend to have a vast amount of positive feeling for doing those altruistic things. I, you know, would have to go and find some reports to give you the scientific numbers on that. But certainly, I understand that's the case. So maybe it's just that it's just not been put together very well?
Paurav: Yes, in a way, if you look at it, you know, from a public campaign perspective, I would consider London 2012 a phenomenal campaign in the way it changed Londoners' behaviours. You know, in that period, we were almost trained not to use the underground, to move away from that public transport system, to provide voluntary services in different ways. There were so many volunteers available for that system. And that was also a public-facing system. But people knew what was happening, what was the outcome, what was the end point, where things were going, why am I doing this?
And I think those kinds of key critical questions are right now missing from every public debate or every public communication that is coming from the particular stakeholders who are engaged on the other side of the table. So people like us are asking, like your interview with Joel was asking very, very clearly, you know, Joel was saying that I got a notification, I don't know what to do. Now, if a professor-level person is also finding it that difficult, then I can understand, you know, I don't want to stereotype, but any person who is not engaged in this process would find it very difficult to grasp.
Sean: That's probably a good point to draw us to a close. Have you got any final thoughts, Christine?
Christine: Yeah, I think there's that aspect. And there's also the issue that, relatively speaking, we have not been in lockdown for a comparatively long time. I mean, London is a very good example, but there, action was urgently needed in order to prevent any further issues. And even though action is urgently needed in this scenario at the moment as well, there's still a large number of people who are shielding at home, who are choosing to isolate, who are choosing not to go to work or cannot go to work because workplaces haven't opened.
And then there's also the question of how many people have access to the technology and have the technical capability of installing apps themselves? Are we actually finding ourselves in a scenario where the people who are the most vulnerable, as in of a particular age, are also the least likely to be able to install this app on their own phone and actually use it to the extent that they should be using it? I think those are all factors that perhaps need to be taken into account here.
Sean: Yeah, that's a really important point, isn't it? That we talked about, you know, even if you've got 100% take up, that's 100% of people with smartphones. Even those people who have smartphones don't always have them in their, you know, in their pocket or whatever. So yeah, we've got a huge sector of people, just as you say, a lot of them are the key people who are actually vulnerable. It's really important to consider that. This is just one tool in the fight against Covid and maybe it could be better. Is that all right? That's probably where we're going to leave this. So, Paurav, thank you for joining us. Christine, thank you for joining us. Joel, thank you for joining us. Gopal had to step out, so hopefully we'll hear more from him next time. Thanks very much for joining us on the Living With AI podcast and we hope to see you next time.
If you want to get in touch with us here at the Living With AI podcast, you can visit the TAS website at www.tas.ac.uk where you can also find out more about the Trustworthy Autonomous Systems Hub. The Living With AI podcast is a production of the Trustworthy Autonomous Systems Hub. Audio engineering was by Boardie Limited and it was presented by me, Sean Riley. Subscribe to us wherever you get your podcasts from and we hope to see you again soon.
[00:46:16]