Justice, Disrupted

Karyn McCluskey and Claire Feasey – In the age of AI, can justice be smarter?

Byron Vincent Season 2 Episode 1

National Event 2025 Special

At ‘In the Age of AI, can justice be smarter?’, expert speakers Claire Feasey, Harmeet Sandhu, Dr Susie Alegre and Shami Chakrabarti gave an overview of the potential within AI for improving efficiency and driving change, while keeping a clear-eyed view of the rights and well-being of individuals. The talks were followed by an audience Q&A.

In this episode:

Karyn McCluskey, chief executive of Community Justice Scotland, introduces the event.

Claire Feasey from IBM presents – Claire works closely with public sector organisations to build innovative technology solutions, including automation, AI, and digital transformation, that address some of the unique challenges faced in justice systems.

Morning, everybody. It's so nice to see everybody in person; I hope you enjoy it. We've got some phenomenal speakers and a really interesting subject. I'm so old now that I feel like I could be carbon dated. I remember when we first got internet access in policing: a single terminal in an office of 40 intelligence professionals, all of us watching the little wheel go round and round. It feels almost laughable now, given how far we've come, and today we're going to talk about something that's evolving even faster: artificial intelligence and justice. The pace of change in AI is absolutely staggering; what we talk about today will be out of date by next week. It's moving so fast. You're going to hear today from experts in the field about what's going on, and that's exactly why we need to talk about it now, because AI is transforming how we work in justice, and particularly in community justice: how we analyse data, how we assess needs, how we deliver services both locally and nationally.

I know some of you in the room will be early adopters. I am an early adopter. You've probably already experienced tools like Copilot, ChatGPT or DeepSeek. Although, trying to do some work yesterday on a government laptop, I typed in DeepSeek only to find that even the term is banned, and all these alarms came up. Others are being more cautious, and that's okay. This is a space where curiosity and caution can coexist. Ivan McKee, the Scottish minister for public sector reform, who's doing a big event next week, launched his public sector reform strategy earlier this year, and he identified AI as a key enabler for improving productivity and service delivery. The UK government's AI and justice strategy has outlined a bold vision, and I'm going to read it out, because it shows where they think they can go.
"AI shows great potential to help deliver swifter, fairer, and more accessible justice for all": reducing court backlogs, increasing prison capacity (I'm not quite sure about that), improving rehabilitation outcomes and enhancing victim services. They talked about reducing administrative burdens with secure AI tools like transcription and document processing, and you're going to hear from and see some real professionals this morning; increasing capacity through smarter scheduling, thinking prison logistics; improving access to justice with citizen-facing assistants; enhancing case handling and service delivery; enabling personalised education and rehabilitation, with tailored training for workforces and for those in prison and community justice; and supporting better decisions through predictive models, like assessing risk of violence in custody.

AI is already becoming embedded in our courts, our policing systems, and our prisons, even in the algorithms that influence decisions about bail, parole, and second chances. But here's the tension. Our justice system is built on fairness, transparency, and accountability, and AI is built on data. I was the head of intelligence analysis for policing for decades, and I know that justice data is biased, incomplete and sometimes opaque. It's been really difficult to miss the conversations recently around live facial recognition, the levels of misidentification, and the concerns about loss of privacy. That's a risk we have to confront head on. And I mean all of you, regardless of where you are: you need to be engaged in this debate. So today you're going to hear from some phenomenal speakers exploring the possible, from Claire Feasey of IBM, to Harmeet Sandhu's work on AI in the Scottish Courts and Tribunals Service, to the powerful voices of Shami Chakrabarti and Dr
Susie Alegre, who both spoke so eloquently about AI last year at the book festival, and to Susie's book Human Rights, Robot Wrongs: Being Human in the Age of AI. For those who are really interested in this, I read the Davos report by the World Economic Forum. They estimate that by 2030, 40% of the jobs that we have now will no longer exist. By 2030: 40%. And they say inequality will widen, because we're not training people quickly enough for the new jobs that might be around. That, to me, feels quite worrying. My own husband has been made redundant because his job, which was in data and finance, is already being done by AI. So things are changing rapidly. But even as we embrace speed and efficiency, we have to remember that justice is delivered by people. You can speed the courts up, you can speed justice up, but eventually it meets a human service: in the community, in the third sector, in prisons. And that means you're going to have more demand for human support, and you'll need more funding to sustain it. For colleagues worried about the environment: every ChatGPT session takes a bottle of water, and billions of litres of water are required to cool the servers. But there might be an opposite effect, where people are travelling less. So there's a yin and yang in terms of the environment. I just want to finish by saying there are no stupid questions today. First off, and I'm not going to read out everybody's biographies, I want to invite up Claire Feasey, who is head of justice transformation and public safety at IBM.

Welcome, everybody, and good morning. Picture the scene: 8:45 in the morning, Glasgow Sheriff Court, and a young woman who's been 12 months on remand, perhaps in HMP Stirling, is waiting to be heard. Her case is thrown out. She has spent 12 months on remand behind bars. Her life is immeasurably changed. Absolutely changed.
She's probably lost her home, her job. She has a totally different outlook today than she had a year ago. She's one of the 57% of people on remand today who do not receive a custodial sentence; 70% of those are women. We don't understand these figures (and we'll talk about stats later on): whether that 57% is because they've already served time on remand, or because they were acquitted, or because there was no evidence and the case got thrown out, or because it simply ran out of time. This is huge, and this is not a statistic. This is a human being, and that is the whole point of today: to talk about the humans behind justice and the humans affected by justice, and to understand the role technology is playing in justice today.

Here's me. It's a great photo, taken inside prison. I spend a lot of time inside prisons. It wasn't something I ever imagined I'd do in my career. I'm actually on a recruitment drive, talking to offenders about work and life outside the prison gates once they are released. As part of my work I spend a lot of time with probation services, a lot of time in courts and a lot of time with prisons, to the point that I've actually got an IBM rugby team: we play with Saracens inside The Mount, and we work on sports rehabilitation with offenders. So it's really important to me to understand the actual humans behind justice and using justice today, and to look at how technology, with transparency and a human-centric approach, can actually transform how justice is delivered and how it is experienced by those going through it. For me, this is not about handing decisions about people's liberty to a robot. This is about expediting justice and giving people the right tools so that they can speed up courts and reduce the backlogs. Karyn said it all.
I don't need to repeat it, but the point is to expedite justice: swifter resolution, so that victims are less re-traumatised by turning up at court for cases that are adjourned and by experiencing that crime again and again; so that victims receive their resolution and justice is served swiftly. But the decisions about an individual's liberty remain with humans. Justice delayed is justice denied. We've heard it many times, and we hear it in England and Wales: if we have such delays and backlogs of 12 to 36 months for a case to be heard, then justice delayed is justice denied. And justice opaque is justice mistrusted. That AI, if it cannot be explained, should not be used, especially in this environment, is a very valid concern. If it's opaque and we don't understand it, how can we use it?

But why now? Well, the criminals are using it. They don't have a budget. They don't have anyone telling them they can't use technology. They have industrialised scams, phishing attacks, deepfakes, cybercrime. It is going through the roof, and it's in their hands, unbidden, and in use today. We will see more and more cases come to court about cybercrime, and about crime using AI to personalise attacks on you, synthesising the voices of your family members to scam you out of money. This is huge. But it's also in the law firms. The law firms are in an arms race at the moment to have the best AI. There are actual discussions in law firms today about whether they should still charge by the hour, or by the six-minute unit, or by the outcome, because what they're doing is using AI to summarise information, to catalogue it, to order it chronologically, and to build their opening summaries and their closing arguments.
And they're doing those personalised to the judge sitting in front of them and to the jury at the side. In the States, an AI agent has passed the bar, and there are AI law firms actually providing this service. So if they're using it, we need to even up the balance. We need a level playing field, and we need to understand how defence firms are working so that the prosecution has the same tools and capabilities available to them. So we need to build AI literacy. Everybody needs to understand it, from social services right the way through to prisons, including courts and policing; everybody needs to understand AI and the risks inherent in it. I'm going to talk about what AI is and how it works, but we need to start with a base level of understanding, because there will be lots of early adopters in the room who have helped their kids with homework on ChatGPT, and I'm one of those. But ChatGPT is open to the internet, so anything you put on it goes out. Copilot is hosted in the States; do we want information going to the States? There's a lot of information moving around, and we need to make sure security and privacy are embedded.

Let's look at what AI is. Artificial intelligence is machines learning like the human brain: they build neural pathways. Just as a child learns a new language or a new subject, the machine learns that way. And when you're choosing a new school for your child, you want to know who the teachers are and where they trained; maybe you need the same with these agent bots. You need to know how they're trained, what information is going in, and how they work. So we are in a huge paradigm shift today. With all the revolutions that have gone before us, the printing press and the steam engine and all of these things.
We decided when we took those on and when we adopted them; we had a do-it-yourself mentality: okay, we'll adopt this now and move it forwards. With AI, it's being done alongside us and to us. You only need to talk about camping (it happened to me the other day) and suddenly you're bombarded with adverts for tents and groundsheets and so on. It is all around you. It's on Alexa in your home. It's on your watch. It's on your phone. It's listening to you, growing data on you, and it's starting to be done to you. So we're moving from "do it yourself" to "do it for me".

At IBM, we were there when the term was first coined, at the 1956 Dartmouth conference; we had a guy there when the phrase artificial intelligence was coined. And we get very confused, because we talk about AI as though it's everything, but a lot of it is automation: processes that are automated, which also make up a great bulk of how we work. We built a machine called Deep Blue that learned chess and beat Garry Kasparov. We built a machine called Watson that won Jeopardy!. And we've continued to do this, but we've done so on the basis of transparency, ethics and explainability: understanding the machine, what it knows and how it has learned. If you do not know those things, you shouldn't be using it.

With the AI pyramid, we have ethics at the very core, and we're going to go into each layer, because ethics is really important. Ethics gives us the privacy around data. It gives us the guidelines around transparency and around explainability, because, as we said earlier, justice opaque is justice mistrusted. There's also information. As we said earlier, we don't know why that 57% of people on remand don't receive a custodial sentence. How do we break those stats down? What is the data behind them? How do we observe that data? And how do we train people so they have better information?
Because if you can't measure it, you can't manage it. Then there's the governance layer: how are we building the policies, the rules and the procedures to maintain what is going on with AI and automation in the organisation or authority? Then there are the tools: what tools do we use, and what tools do we allow? Proof: how do we build pilots? And then personalisation, which is absolutely where we should be going, but you can't do it without getting the ethics right.

I'm going to leave explainability to the next slide. First, let's look at fairness. We know that these AI agents can learn bias. Humans inherently have our own biases; you only need to look at some of the BBC documentaries going on at the moment. So we need to be able to test these models, audit them, and ensure that they are treating individuals with equality. It's absolutely paramount that bias is stamped out of these models; they should be less biased than human beings. Robustness: they need to withstand malicious attack. They need to stand up and not be hacked. Models should be usable, as IBM's models are, in critical national infrastructure: in fraud detection, in governments, in healthcare institutions and so on. They should be robust and able to carry information securely. Transparency: again, we need to know how these algorithms are built and how these models are tested, and to be open and honest about exactly what has gone into the models in the first place. And privacy: this is not just about GDPR. This is about human beings' criminal records, financial records, and everything about them.

So, explainability. If your son comes home from maths and says, "Right, I've got this gnarly maths problem. The answer is six." Well, is it really? I have twins: one who is really diligent, and one who is complete chaos, and they'll both go, "Yeah, it's six." Where's the explanation? How did you get there?
These models have to give you referenceability, because if you're going to use them to summarise case information, witness information and all of your case documents, and if they're going to reference precedent cases, they need to be able to tell you which ones, where they are, and link you to them. It needs to be explainable. You can't just have it give you the answer and say it's six. Is it? I did A-level maths, and this one really got me.

Information, training and data observability. We do not know how Scotland runs today. A lot of it is in the dark in terms of data: the types of cases, the number of people on probation or community service today who are recalled, and why they're recalled. We know that in England and Wales 80% of recalls are for licence violations rather than reoffending, for example. We don't understand the recidivism rates. So data observability around reoffending is huge. If we can see it and measure it, we can start to understand the size of the problem and how to manage it. And again, training: information is about training. The more people understand, the less they're going to go under their desk and get on ChatGPT in a shadow environment, which is really dangerous, and go on the open internet for some quick answers.

Governance: stop the shadow IT, have compliance standards and policies, and have them across the board. That's not just in courts; it's also in prisons, across probation, across social security and social services, across policing. There needs to be a full set of policies and rules on how this is used, because this is moving all the time. It needs to be curated, understood, and followed to the letter. It needs to be policed, because AI itself carries inherent risks. AI is like having a thousand graduates working for you. These graduates could be schedulers.
They could be personal assistants. They could summarise documents. They could do chronological ordering. It's like having all these graduates, but some of them are going to make stuff up. So you need humans in the loop checking the work that comes through, and having the right governance in place helps you maintain that. Then the tools: what tools are we going to use, and how? How are they locked down? How do we make sure those tools are used by everyone who is trained? Which ones are approved, which are not, and which are banned? You obviously can't get on to DeepSeek, and that's quite a good thing, I would say. Some of these tools will automatically transcribe your notes. They will automatically summarise information. They will do lots of this off the bat: document management, a lot of your admin gone.

Then you get to the fun stuff, which is piloting, and Harmeet is going to talk about this later on: piloting use cases with a solid business case behind them; use cases for searchable evidence, chronological ordering of evidence, smart scheduling of courts, smart scheduling of prison spaces and so on. And personalisation is, for me, where the grail is: personalising the victim's journey and victim support. This is huge. When you speak to victims, they don't just want a resolution; they also want whatever has happened to them not to happen to anybody else. And making them wait to get to trial, making them turn up at trial and having cases adjourned and spun out, is simply not fair. Personalised journeys for probation: in England and Wales, probation officers spend 70% of their time on admin and paperwork.
That means transcribing notes from their meetings with probationers, but also producing pre-sentencing reports and all the reporting. They did not join the job to sit at a computer writing notes; they joined to help people rehabilitate and turn their lives around. But only 30% of their time is spent face to face. With AI, you can turn that the other way round, so that 70% of their time is spent face to face with offenders and ex-offenders, helping manage offenders in the community. Really, super important.

So let's look to the future. What can we do? I've told you some of the risks and some of the inherent challenges with AI. Here we go. Quicker courts. At the moment, courts have real backlog delays, and there is the ability to have automated transcription (I know there are legal and government policies around that), plus the ability to search transcripts; smart listings, checking that the evidence is available to hold the case in the first place; and smart scheduling of court cases. Disclosure reduction: you can do redaction with a machine and then have humans check it, very quickly and very easily.

But also, let's look at what's going on in Germany. In Germany, they have two AI assistants, one called Olga and one called Frauke, that deal with bog-standard cases in the backlog. We know that in England and Wales each magistrate (my boss is one) spends an hour per sitting, every magistrate across the land, with a pile of paperwork: speeding fines, parking fines, Transport for London fare dodging, TV licence evasion. They go through each one, and there is a set number of outcomes. They look at whether the individual is employed, and what the reason is that they haven't paid their speeding fine.
They look at whether you're employed, where it happened, whether there was a reason. In Germany, all of that gets fed through a bot and the summary is presented to the judge. The judge can then click: Claire Feasey, what was she up to? Go straight into the file. 35 in a 30 limit, Brad Park Road, on her way back from the dog groomer's, didn't pay the fine in time. She gets this fine, she gets this number of points, done. Then there will be the outlier cases: someone driving a person who's giving birth in the front of the car, say. Maybe that's a different circumstance and they receive a different judgement. The outlier cases go to the judge, and the judge makes a decision on them. But the summary is there, so they can click through and check. They use this for airline compensation in Stuttgart, and they use it for Dieselgate car cases. Once they're happy with all of the summaries and have checked the case notes, they just press action, and all of the fines, everything, is actioned and sent out to the individuals. Done. That hour that every magistrate in England and Wales, and every justice of the peace here in Scotland, goes through is cleared, so they can spend their time on criminal and civil cases that are more important than speeding fines and parking tickets. Clearing the backlog of stock-standard cases is a very easy use case.

Smarter justice for victims. I spend a lot of time talking to victims, and a lot of them want personal support; they want humans. But some of them don't want a human. Some have been through such traumatic experiences that they actually feel more comfortable with a monitor than with a human being. But isn't it up to them to decide how they experience justice?
What if you could empower them with an app and support that gives them real-time updates on the case: what's going on, when they're needed at court; gives them a virtual tour of the courtroom; tells them when their support person is going to be waiting for them? They can ask questions like, "Where are the toilets? Can I take medication with me? Am I allowed to take snacks? How long will I be waiting?" A victim should be able to decide how they experience all of those things. So often we forget the victims in the whole piece about criminal justice. And it's up to them: if they want more human support, they should have it. But if they want a facility with real-time updates, they should have that too.

What about the people running probation or managing offenders in the community? We know how much time they're spending in England and Wales on admin tasks, writing up their notes rather than spending time face to face. And we know that face-to-face time actually improves the human experience and improves rehabilitation. So wouldn't it be great if you had something drafting your reports for your review and input, which you could then edit? Something transcribing your meeting notes for you? Something monitoring the electronic tags of the people in your care, the team of people you're looking after, and actually linking all of that up? Your admin: it's like having that bunch of graduates working for you. All of those repetitive, paperwork-heavy tasks are done for you; you can check them, amend them, and send them out, giving more time for face-to-face work. It's also possible to link up the data. Electronic monitors today send out a ream of data, and you actually have to go and look for Claire Feasey's data: where was she on Friday night?
Was she actually in her house by 8:00 at night, or whatever the licence conditions were? Wouldn't it be great if you could see all the breaches on a map in real time and prioritise your caseload, as someone working in social services or justice services, and actually prioritise who you look into and who you support first? To be able to do that for offenders in the community is huge. But you don't actually need an electronic monitor. People could choose to drop a GPS pin. They could be asked to drop a GPS pin. This is not mass surveillance; this is scaffolding to help them rebuild their lives.

So let's look at offenders. I've spent a lot of time in prisons with offenders, and offenders are really interesting. Some of them have said to me that the one thing they don't get, and really wish they had, is praise. They've never had respect. They've never had praise. We see a lot of this on the rugby field: they absolutely love to talk about the amazing ball that went down the wing, the amazing try that was scored. They love the pat on the back. And when I talked to them, I said, "Would you like a streak of achievement: you've been on these courses, you've done this training?" Oh, yes. I said, "Would you like nudges to remind you to turn up to your probation appointment?" It's not unknown in England for 500 appointments to be offered, 27 to be accepted, and five to be turned up to. I can't book a dentist appointment without getting a nudge by text saying I've got a dentist appointment and I'll be in trouble if I don't turn up. So: nudges; the ability to warn them, "You're getting close to an exclusion zone; you shouldn't be where you are"; but also submissions for exceptional circumstances. In England today, it takes two weeks to put in a submission for exceptional circumstances to attend a funeral or to go to a sick loved one's bedside in hospital.
Or, in my case, the rugby captain who had left prison. He was meant to play at Saracens' StoneX Stadium. He was so excited, and he didn't get his request in within the two-week limit to ask to be there. So I went to support him, and he wasn't on the pitch. He was still in Norfolk, not able to go. But if we've got these streaks of good behaviour, of turning up to appointments and actually making those appointments, surely he should be able to ask in real time: "Can I go to this? You can track me. I'll drop GPS pins whenever you text me. I just want to be on the pitch." And maybe he could have been there. It could also help him know what's around him today: where can he get drug and alcohol support? Where can he get to a GP? Where can he get to a library for free Wi-Fi? Community services, charity support, all of those things you could support him with to help him rebuild his life.

We've talked a lot about observability and data, but let's look at dashboards. We could reduce outstanding community service hours very quickly, and reduce the breach rate, if we could nudge people into attending their appointments, if we could nudge them into owning that rehabilitation journey and being in control of it. We could start looking at technical breaches. We have lots of people in England and Wales who don't charge up their tags, or forgot to charge them, or the tag isn't working anymore, and suddenly they're back inside prison. We could actually start bringing those technical breaches down. And then there's rehabilitation. We can link up some of the amazing work going on. I must admit, I am blessed to have been able to watch the documentaries on Barlinnie, HMP Grampian and HMP Stirling, and a huge shout-out to the people who work in those environments. The staff in those documentaries were outstanding.
The work they do in those therapeutic wings is just phenomenal. But that can carry on outside the prison walls, with continued mindfulness training and continued support. And for me, that can go on in an app. It can learn you, get to know you, and give you crisis support.

So let's look at this in real life. This is a real product that we have with Veterans Affairs in the States. It's called GRIT (I love the name): Get Results In Transition. The interesting thing is that veterans have the same challenges as people leaving prison: the same challenges with employment, mental health, suicide and self-harm; the same issues with their health; the same issues fitting back into the community and into civilian life. We've built an app that we're rolling out throughout Veterans Affairs for vets leaving active service. What it does is give them a mental well-being check-in and mindfulness training tailored to them and where they are. It also links them to employment, matching their skills and their location to what's available around them across 41,000 job sites. And we could do that with voluntary service as well; we could link you to community service skills too. There's also a link to the squad, to support, to the team, to people going through the same thing, so they still have that battalion mentality they had in the military. And this support goes on and on.

I think the most telling piece I'll leave you with is that within this app is a crisis button. When people leave prison today, they leave with a wodge of paper, their licence conditions, 35 quid in England, and off you go, turn up at your appointment on Thursday.
If you are struggling with mental health issues, you're not going to take the time to get through that raft of paper to find the Samaritans helpline. You're not going to take the time to find out how to look for employment, or community service, or charity support. So this crisis button has proven to be a lifeline. Of the first tranche of veterans on the first pilot, 15 hit the crisis button and were linked to someone to help them. That's potentially 15 lives saved: 15 families with their son, their dad, their uncle getting the support they need. And this is support that people leaving prison need. And this is real; you can look it up, it's all on the web.

And here's Scotland. Thank you for having me, Scotland. What an amazing country. From the islands to the Highlands to the cities, Scotland is so diverse, and you have your own challenges with that diversity, with the different regions and different local authorities, and linking all of that up into one clear strategy for the use of AI is going to be a challenge. But with 8,000 people in prison today, and 25% of those on remand, you have the ability to lead the march. We've heard about the MoJ's strategy for the future: they're piloting transcription tools and summarisation tools, and they've got lots of little pilots going on. But Scotland could actually lead the march, ahead of England and Wales and ahead of the rest of Europe, by taking very small steps in a very determined way, with transparency and explainability, underpinned by ethics and by the right governance and policies. Scotland has the ability to lead on this, into a digital age where justice is swifter and accelerated, and our duties to public safety are met, our duties to victim support are met, and our duties to the public purse are met, because we're doing more with less.
Budgets are never going to increase. We need to do more with less. We need to look at the commercial sector, see what they're doing, and learn from them. But yes, justice can be swifter, and all of those repetitive tasks can be taken away. So, thank you for having me. It's been my absolute privilege, and I look forward to the rest of the day. I hope you enjoy yourselves. Thank you so much.