AXREM Insights

S5 E9 Imaging IT with Impact: Manifestos, Medtech and the Future of Radiology

Melanie Johnson / Sally Edgington Season 5 Episode 9

In this episode of AXREM Insights, Melanie Johnson and Sally Edgington speak with Bob Child, Chief Commercial Officer at Soliton IT and AXREM’s Imaging IT Special Focus Group Convener, and Saduf Ali-Drakesmith, Head of Sales at Smart Reporting and Vice Convener. They reflect on their careers in imaging IT, share stories from the early days of digital transformation in radiology, and discuss their passion for improving patient care through innovation. The conversation highlights the strong sense of camaraderie in the medtech industry and the invaluable role AXREM plays in uniting competitors, colleagues and policymakers to work collaboratively.

The episode dives into the Imaging IT Manifesto, launched at Bletchley Park, which outlines the group’s vision and calls to action on topics such as cybersecurity, interoperability, AI integration and national policy alignment. The guests underscore the manifesto’s ambition to transform imaging services across the UK and explain how the group is actively working with key NHS stakeholders to turn these priorities into action. 

Documents mentioned during the episode include the AXREM Imaging IT Manifesto, the Strengthening Cyber Security in UK Healthcare: A Strategic Approach for Continuity and Resilience paper, and the Multi-Factor Authentication (MFA) Guidance.


Rad Magazine, sponsor of AXREM's UKIO Drinks Reception and a leading publication in the medical imaging and oncology space.

Thanks for listening to this week's episode

To find out more about AXREM check out our website HERE
If you are interested in joining AXREM as a member CLICK HERE
To contact us CLICK HERE

And join us next time for more insights from industry.

[00:00.000 --> 00:02.760]  Welcome to AXREM Insights,
[00:02.760 --> 00:05.480]  developing healthcare through medtech and innovation.
[00:05.480 --> 00:08.240]  Join Melanie Johnson and Sally Edgington as they
[00:08.240 --> 00:11.240]  talk with our industry leaders and experts.
[00:11.240 --> 00:14.720]  Hello and welcome to the final episode in this series.
[00:14.720 --> 00:17.360]  I'm Melanie Johnson and I'm here with Sally Edgington.
[00:17.360 --> 00:20.360]  Today, we had the pleasure to be speaking to both Bob Child,
[00:20.360 --> 00:22.640]  Chief Commercial Officer at Soliton IT,
[00:22.640 --> 00:25.720]  and AXREM's Imaging IT Special Focus Group Convener,
[00:25.720 --> 00:27.600]  as well as Saduf Ali-Drakesmith,
[00:27.600 --> 00:29.520]  Head of Sales at Smart Reporting,
[00:29.520 --> 00:32.960]  and AXREM's Imaging IT Special Focus Group Vice Convener.
[00:32.960 --> 00:34.560]  Welcome Bob and Saduf,
[00:34.560 --> 00:36.240]  and thank you for being on our show today.
[00:36.240 --> 00:38.480]  Now, let's get started by handing over to
[00:38.480 --> 00:42.160]  you to tell us a little bit about yourself and what's your story.
[00:42.160 --> 00:46.580]  Yeah, sure. Clinical by background,
[00:46.580 --> 00:50.960]  and then fell into clinical IT,
[00:50.960 --> 00:55.840]  so I've been involved in lots and lots of imaging IT projects,
[00:55.840 --> 00:58.280]  and reporting; I was one of the first consultant
[00:58.280 --> 01:03.560]  radiographers in the UK, way back when.
[01:03.560 --> 01:06.880]  I currently work at Smart Reporting.
[01:06.880 --> 01:12.360]  We're an AI software company for reporting workflows.
[01:12.360 --> 01:16.680]  A little fun fact about me is I surf very badly.
[01:16.680 --> 01:19.200]  For those of you in the industry who have seen me over
[01:19.200 --> 01:24.000]  the last couple of years hobbling and on crutches and all that thing,
[01:24.000 --> 01:25.720]  that is all surfing related.
[01:25.840 --> 01:28.720]  As I said, I do it very badly.
[01:28.720 --> 01:34.120]  Oh dear. You just said you transitioned from clinical into IT,
[01:34.120 --> 01:36.000]  so what made you do that?
[01:36.000 --> 01:37.680]  I was made.
[01:37.680 --> 01:43.040]  You were made. Who taught you whatever words?
[01:43.040 --> 01:46.000]  Literally though, that's how it happened.
[01:46.000 --> 01:48.600]  My old radiology services manager,
[01:48.600 --> 01:51.240]  this was during the time of the National Programme for IT,
[01:51.240 --> 01:53.060]  essentially said, Saduf,
[01:53.060 --> 01:54.920]  go to this meeting and tell them no.
[01:55.080 --> 01:57.040]  I was like, yeah, okay, no worries.
[01:57.040 --> 01:58.960]  I came back from the meeting and I remember saying,
[01:58.960 --> 02:00.880]  Carol, you can't say no,
[02:00.880 --> 02:02.160]  it's a government thing.
[02:02.160 --> 02:04.920]  She just went, oh, just deal with it.
[02:04.920 --> 02:09.120]  I dealt with it and that was the National Programme for IT
[02:09.120 --> 02:13.840]  PACS procurement at Bedford General Hospital.
[02:13.840 --> 02:17.200]  We went from film to filmless over a weekend.
[02:17.200 --> 02:20.120]  Yes, I was crazy enough to think that
[02:20.120 --> 02:23.280]  a big bang approach would be the right way to do it.
[02:23.520 --> 02:24.880]  It was not, by the way.
[02:26.800 --> 02:27.840]  Not that it went badly,
[02:27.840 --> 02:29.680]  it was just if I chose to do it again,
[02:29.680 --> 02:31.960]  I would definitely not go big bang.
[02:31.960 --> 02:35.160]  So that's how I sort of fell into imaging IT,
[02:35.160 --> 02:37.760]  PACS, RIS related, and then from there,
[02:37.760 --> 02:41.600]  I moved on from a small district general hospital
[02:41.600 --> 02:43.040]  to a large teaching hospital,
[02:43.040 --> 02:46.560]  one of the largest, most complex hospitals in the UK,
[02:46.560 --> 02:48.040]  one extreme to the other.
[02:48.040 --> 02:50.880]  I went to UCLH and did a bit more there
[02:50.880 --> 02:54.360]  and really got into that digital transformation.
[02:54.360 --> 02:58.720]  And then I left about 10 years ago now
[02:58.720 --> 03:01.040]  and was headhunted out into commercial
[03:01.040 --> 03:04.400]  and that's what brings me here.
[03:04.400 --> 03:06.840]  I never look back, clearly.
[03:06.840 --> 03:09.000]  Sometimes look back, sometimes.
[03:09.000 --> 03:11.160]  You know what, if anyone's worked in clinical,
[03:11.160 --> 03:12.920]  those Friday, Saturday nights in A&E
[03:12.920 --> 03:15.800]  where you used to have a laugh and you know,
[03:15.800 --> 03:19.320]  there's just a certain camaraderie around working frontline
[03:19.320 --> 03:21.040]  that I think you do miss.
[03:21.040 --> 03:22.480]  I think you miss like the buzz,
[03:22.480 --> 03:27.480]  you miss the adrenaline rush of just, yeah, clinical.
[03:27.760 --> 03:31.880]  But then I see what good we can do on this side of things
[03:31.880 --> 03:34.440]  and yeah, then you miss it a little less.
[03:34.440 --> 03:37.160]  You still realize that you are having an impact
[03:37.160 --> 03:41.600]  on patient care, but it's just in a more strategic
[03:41.600 --> 03:44.080]  and vendor related way, right?
[03:44.080 --> 03:48.480]  Being able to shape what your peers and ex-colleagues
[03:48.520 --> 03:50.720]  are using to deliver patient care
[03:50.720 --> 03:54.200]  still means that you are making a difference.
[03:54.200 --> 03:57.400]  So you get your thrills now from surfing, clearly.
[03:57.400 --> 03:58.560]  Some, yes.
[03:58.560 --> 04:02.440]  As I said, they're bad thrills too, but I do.
[04:02.440 --> 04:03.720]  I love the water, I love it.
[04:03.720 --> 04:06.000]  I just, I'm just not great in it.
[04:08.000 --> 04:08.840]  Oh, thank you.
[04:08.840 --> 04:09.880]  And over to you then, Bob.
[04:09.880 --> 04:11.120]  Tell us a little bit about yourself.
[04:11.120 --> 04:13.000]  What's your history?
[04:13.000 --> 04:14.760]  Well, I'll start with,
[04:14.760 --> 04:16.920]  I can tell you the other side of Saduf
[04:16.920 --> 04:20.960]  because Saduf was one of my customers.
[04:20.960 --> 04:25.840]  So I can tell you the real Saduf side.
[04:25.840 --> 04:29.400]  And as for surfing, well, what can I say
[04:29.400 --> 04:31.880]  about wetsuits, et cetera?
[04:31.880 --> 04:33.200]  But anyway, there's nothing.
[04:33.200 --> 04:34.040]  Shh.
[04:34.920 --> 04:36.280]  Moving on swiftly.
[04:36.280 --> 04:40.520]  Moving on quickly, so let's start by saying
[04:40.520 --> 04:43.120]  that I started as a debt collector,
[04:43.120 --> 04:44.960]  but actually as a credit controller
[04:45.040 --> 04:49.440]  for one of our major healthcare global companies
[04:49.440 --> 04:51.520]  where I remained 23 years.
[04:51.520 --> 04:55.160]  Starting in credit control got me into contracts
[04:55.160 --> 04:57.400]  and got me towards service.
[04:57.400 --> 05:00.960]  And that was really where my background was.
[05:00.960 --> 05:05.040]  And that took me into PFI contracts
[05:05.040 --> 05:09.480]  and the national contract, which is where I became
[05:09.480 --> 05:12.520]  linked with healthcare IT
[05:12.520 --> 05:15.160]  and all the various systems within that.
[05:15.160 --> 05:18.080]  So I played a major part in the National Programme.
[05:18.080 --> 05:21.920]  I was working, I was the healthcare IT manager for Philips.
[05:21.920 --> 05:24.760]  I was there 23 years.
[05:24.760 --> 05:27.280]  We delivered firstly the Sectra software
[05:27.280 --> 05:32.280]  and then Philips software into London with the LSP with BT.
[05:32.760 --> 05:37.440]  I then moved through and logical progression was
[05:37.440 --> 05:39.680]  as healthcare IT manager doing lots
[05:39.680 --> 05:43.720]  of different regional deployments, et cetera, at the time.
[05:43.720 --> 05:45.640]  And as Sadoof has already said,
[05:45.640 --> 05:48.640]  the major win of the National Programme was everyone
[05:48.640 --> 05:51.720]  in England became filmless.
[05:51.720 --> 05:54.040]  So moving from film to digital.
[05:54.040 --> 05:57.640]  And I think we can honestly say that the RIS and PACS
[05:57.640 --> 06:01.760]  was probably the only success of the National Programme.
[06:01.760 --> 06:04.120]  So it was a real successful thing.
[06:04.120 --> 06:06.960]  Since then, as I say, I joined Sectra
[06:06.960 --> 06:08.480]  as their sales manager.
[06:08.480 --> 06:10.040]  We had great success at the end
[06:10.040 --> 06:13.120]  of the National Programme around London.
[06:13.120 --> 06:17.800]  And I joined Soliton 10 years ago.
[06:17.800 --> 06:22.520]  Soliton are a radiology information system company.
[06:22.520 --> 06:27.520]  We sold our first RIS into St George's, Tooting, in 2012,
[06:27.840 --> 06:29.840]  the area that I came from.
[06:29.840 --> 06:34.720]  And basically today I'm the chief commercial officer.
[06:34.760 --> 06:39.520]  And obviously the AXREM healthcare Imaging IT
[06:39.520 --> 06:41.960]  Special Focus Group chair.
[06:41.960 --> 06:46.080]  The little thing about me, I love AFC Wimbledon.
[06:46.080 --> 06:50.080]  We're going, as people know, hopefully up this year
[06:50.080 --> 06:52.640]  to League One, but the Wimbledon story,
[06:52.640 --> 06:54.360]  well, I've followed Wimbledon all my life.
[06:54.360 --> 06:56.640]  So it's been an amazing journey,
[06:56.640 --> 06:58.760]  and I've followed them since I was seven.
[06:58.760 --> 07:02.360]  And I'm also a keen golfer, although perhaps not as good
[07:02.840 --> 07:05.240]  as I would like to be.
[07:05.240 --> 07:07.160]  So that's me really.
[07:07.160 --> 07:09.160]  And anyone who knows you, Bob, is always looking
[07:09.160 --> 07:12.040]  out for your golfing shirts to see what color
[07:12.040 --> 07:13.960]  you were in next time.
[07:13.960 --> 07:15.000]  Yes.
[07:15.000 --> 07:17.760]  Yes, I have about 90.
[07:17.760 --> 07:21.600]  So, and they're all color coded in the wardrobe.
[07:21.600 --> 07:25.240]  So yes, I'm always on the lookout
[07:25.240 --> 07:27.440]  for a new color that I haven't got,
[07:27.440 --> 07:29.680]  but I'm not sure it exists.
[07:29.680 --> 07:31.800]  Even Dulux don't have that many colors,
[07:31.800 --> 07:33.280]  but nevermind.
[07:33.280 --> 07:34.120]  Oh, wow.
[07:35.480 --> 07:37.000]  Excellent.
[07:37.000 --> 07:42.000]  Yes, but as I say, yeah, as you know, golf to anyone.
[07:44.040 --> 07:48.080]  My wife says it's men, sticks and balls.
[07:48.080 --> 07:49.640]  That's what she calls it.
[07:49.640 --> 07:54.640]  But I think for me, it's just a lovely break from work
[07:55.000 --> 07:59.000]  and everything else in a way that I like to relax.
[07:59.000 --> 08:01.160]  Yeah, and you have to switch off sometimes, don't you?
[08:01.160 --> 08:02.440]  Work can be so intense.
[08:02.440 --> 08:03.920]  You have to have that switch off.
[08:03.920 --> 08:05.640]  So thank you for sharing.
[08:05.640 --> 08:08.440]  I will say to you, and I think it's important,
[08:08.440 --> 08:12.560]  the radiology industry is amazing.
[08:13.400 --> 08:18.400]  The actual people that I've met in the 34 plus years
[08:18.400 --> 08:23.160]  that I've done it, the customers, the people I work with,
[08:23.160 --> 08:26.400]  their dedication to radiology.
[08:26.400 --> 08:30.640]  If people really understood, you know,
[08:30.640 --> 08:35.040]  I've had scenarios from hospitals out of action
[08:35.040 --> 08:38.960]  to people leaving Christmas Day, whatever day,
[08:38.960 --> 08:43.960]  birthdays, weddings, to do work just to help colleagues.
[08:44.160 --> 08:47.760]  We are an amazing industry and I can't think
[08:47.760 --> 08:49.480]  of an industry that's like it.
[08:49.480 --> 08:54.200]  You know, I truly would call competitors friends
[08:54.200 --> 08:58.400]  because everyone tries to work together
[08:58.400 --> 09:00.640]  and it's such a good industry to work in.
[09:00.640 --> 09:02.920]  Hence why I've been here such a long time.
[09:02.920 --> 09:05.640]  Our industry is something phenomenal.
[09:06.680 --> 09:07.960]  Yeah, and I must say actually,
[09:07.960 --> 09:09.680]  from Sally and I's point of view,
[09:09.680 --> 09:11.520]  we say this often, don't we, Sally,
[09:11.520 --> 09:13.920]  that we are so lucky and privileged
[09:13.920 --> 09:15.120]  to have the members that we have.
[09:15.120 --> 09:19.320]  You make our jobs a million times easier than they should be.
[09:19.320 --> 09:21.040]  So as I say, from our point of view,
[09:21.040 --> 09:22.840]  thank you for that as well.
[09:22.840 --> 09:24.400]  Yeah, yeah.
[09:24.400 --> 09:25.560]  We're very lucky.
[09:25.880 --> 09:28.960]  We're very lucky to have such an engaged membership
[09:28.960 --> 09:31.920]  and we can tell in all the meetings that we host
[09:31.920 --> 09:35.600]  that members are in it for the greater good as well.
[09:35.600 --> 09:38.560]  You know, because the work that all our members do
[09:38.560 --> 09:40.400]  and that AXREM does indirectly
[09:40.400 --> 09:42.680]  can make a difference to patients.
[09:42.680 --> 09:45.080]  And I think that's what drives all of us.
[09:45.080 --> 09:47.760]  So I absolutely agree.
[09:47.760 --> 09:49.720]  And I think moving on now,
[09:49.720 --> 09:53.320]  I think that the listeners would really like to find out
[09:53.320 --> 09:55.960]  more about the Imaging IT Special Focus Group
[09:55.960 --> 09:57.880]  and what your priorities are.
[09:57.880 --> 10:00.360]  But I think we also have to talk about the manifesto.
[10:00.360 --> 10:03.920]  We obviously had an amazing launch of our manifesto
[10:03.920 --> 10:06.640]  at Bletchley Park recently.
[10:06.640 --> 10:09.080]  And it would just be good to kind of delve
[10:09.080 --> 10:10.760]  into some of the priorities
[10:10.760 --> 10:12.840]  within the Special Focus Group and the manifesto.
[10:12.840 --> 10:15.680]  So if okay, I'll go over to Bob first,
[10:15.680 --> 10:19.480]  just if you want to talk a bit about the priorities.
[10:19.480 --> 10:20.840]  Absolutely, thanks, Sally.
[10:20.840 --> 10:22.160]  I think you're absolutely right.
[10:22.160 --> 10:25.200]  And firstly, can I say what a wonderful event
[10:25.200 --> 10:29.480]  we had at Bletchley Park and the guests, the speakers,
[10:29.480 --> 10:31.160]  the people that attended.
[10:31.160 --> 10:33.800]  It was an amazing, amazing venue, an amazing day
[10:33.800 --> 10:38.280]  and probably one of the highlights of my career for sure.
[10:38.280 --> 10:40.520]  Probably the best way to sum up
[10:40.520 --> 10:43.320]  the AXREM Imaging Manifesto
[10:43.320 --> 10:46.280]  is to explain why we've done this.
[10:46.280 --> 10:47.200]  And that was it.
[10:47.200 --> 10:50.280]  Our vision was to be the leading voice
[10:50.280 --> 10:52.840]  in medical imaging and radiology.
[10:52.840 --> 10:55.680]  We want to drive innovation.
[10:55.680 --> 10:58.200]  We want to promote best practices
[10:58.200 --> 11:01.160]  and advocate for policies that support
[11:01.160 --> 11:03.400]  the delivery of high-quality
[11:03.400 --> 11:08.200]  and accessible imaging services to patients worldwide.
[11:08.200 --> 11:09.880]  And I think as a group,
[11:09.880 --> 11:13.400]  I think we really have strived to do that
[11:13.400 --> 11:18.400]  and strive to make the patient experience so much better.
[11:19.160 --> 11:22.880]  And what a wonderful group to work in
[11:22.880 --> 11:26.960]  with such great industry experts.
[11:26.960 --> 11:29.720]  The strength of us together is amazing.
[11:29.720 --> 11:33.680]  And I think that statement about where we wanted to lead,
[11:33.680 --> 11:37.720]  that vision of being the leading voice.
[11:37.720 --> 11:41.040]  And I honestly believe over the last couple of years,
[11:41.040 --> 11:43.120]  AXREM have achieved that.
[11:43.120 --> 11:46.720]  And I honestly believe our AXREM Imaging Manifesto
[11:46.760 --> 11:48.920]  that we launched proved
[11:48.920 --> 11:52.040]  that we have actually carried out our vision.
[11:52.040 --> 11:57.040]  So it was a very proud moment and a really good day.
[11:59.120 --> 12:00.560]  I totally concur, Bob.
[12:00.560 --> 12:02.800]  And it was one of my career highlights as well.
[12:02.800 --> 12:07.600]  And I think seeing the level of attendee and speaker
[12:07.600 --> 12:11.320]  really does show how far AXREM has come.
[12:11.320 --> 12:14.040]  Saduf, did you want to share any insights
[12:14.040 --> 12:17.680]  about our manifesto launch or the manifesto itself?
[12:17.680 --> 12:20.520]  Yeah, I just wanna agree that first off,
[12:20.520 --> 12:22.040]  what a cracking day.
[12:22.040 --> 12:25.640]  Like the sun was shining, an amazing venue,
[12:25.640 --> 12:27.760]  fantastic speakers.
[12:27.760 --> 12:31.120]  And I think also for me, the level of engagement
[12:31.120 --> 12:33.040]  and the presence that we had from people
[12:33.040 --> 12:36.520]  just shows how committed and passionate
[12:36.520 --> 12:39.680]  this industry is about what we do, right?
[12:39.680 --> 12:41.760]  There wouldn't have been as many of us turning up
[12:41.760 --> 12:43.880]  and being a part of that launch
[12:44.720 --> 12:48.320]  if it wasn't so intrinsically important
[12:48.320 --> 12:50.360]  and embedded within all of us
[12:50.360 --> 12:52.880]  that we fully believed what was in that manifesto as well.
[12:52.880 --> 12:55.560]  And that manifesto really provides
[12:55.560 --> 12:58.760]  like a clear and future focused strategy
[12:58.760 --> 13:01.600]  for imaging services across the UK.
[13:01.600 --> 13:03.760]  And for me, it really highlighted
[13:03.760 --> 13:08.160]  like how critical it is for all of us to come together
[13:08.160 --> 13:09.280]  as that collective voice
[13:09.280 --> 13:12.520]  for really transforming radiology IT.
[13:13.320 --> 13:15.280]  Some of the priorities that came out of that
[13:15.280 --> 13:20.080]  around cyber security, innovation and interoperability
[13:20.080 --> 13:24.120]  really show like we want to align industry
[13:24.120 --> 13:27.480]  and NHS and police policy bodies
[13:27.480 --> 13:31.680]  to almost serve as like a catalyst
[13:31.680 --> 13:34.960]  for delivering better and connected imaging services
[13:34.960 --> 13:38.760]  for patients and clinicians, right?
[13:38.760 --> 13:39.680]  I spoke at the beginning
[13:39.680 --> 13:41.880]  about where my background comes from.
[13:41.880 --> 13:44.000]  It's that passion that's driven
[13:44.000 --> 13:47.800]  like what we're pushing for in those priorities
[13:47.800 --> 13:49.840]  in that imaging IT manifesto.
[13:51.360 --> 13:52.280]  Yeah, and I think as well,
[13:52.280 --> 13:54.920]  when you look at all of the calls to action
[13:54.920 --> 13:57.880]  within the manifesto, I think we see a theme.
[13:57.880 --> 14:02.120]  So it's around bringing the IT community together.
[14:02.120 --> 14:05.040]  It's about healthcare providers and technology vendors
[14:05.040 --> 14:06.440]  collaborating closely
[14:06.440 --> 14:09.840]  and developing integrated imaging solutions
[14:09.840 --> 14:12.360]  around stakeholders adopting and adhering
[14:12.360 --> 14:14.960]  to established interoperability standards,
[14:16.200 --> 14:18.800]  and working with the NHS to provide access
[14:18.800 --> 14:20.720]  to key contacts
[14:20.720 --> 14:23.000]  in the event of a cyber attack.
[14:23.000 --> 14:24.320]  When you look through the document,
[14:24.320 --> 14:25.640]  there are quite a few calls to action,
[14:25.640 --> 14:26.840]  and I would say to our listeners,
[14:26.840 --> 14:28.840]  please do read the document
[14:28.840 --> 14:30.720]  and we'd love to engage with you.
[14:30.720 --> 14:33.080]  If there's anything there on the calls to action
[14:33.080 --> 14:35.680]  that you feel you want to engage with us on
[14:35.680 --> 14:37.520]  or that actually you can help with.
[14:38.520 --> 14:40.440]  Well, I think Sally,
[14:40.440 --> 14:42.880]  it's probably a little bit more than that as well
[14:42.880 --> 14:46.240]  because what has come out of that since is that
[14:47.560 --> 14:51.880]  we've seen already some great feedback
[14:51.880 --> 14:56.360]  and we've been asked to attend various presentations
[14:56.360 --> 15:00.720]  or meetings around some of our calls for action.
[15:00.720 --> 15:03.360]  And I think it's really my job as chairman
[15:03.360 --> 15:06.680]  and our job as a group is to ensure
[15:06.960 --> 15:10.720]  that these calls for actions are followed up on
[15:10.720 --> 15:14.040]  and we treat them as our priorities
[15:14.040 --> 15:18.160]  because we honestly, when we set these calls for action,
[15:18.160 --> 15:21.840]  we honestly believed they would make such a major difference
[15:21.840 --> 15:26.840]  to not only our industry, but to the NHS and to patients.
[15:27.120 --> 15:28.920]  So I think our job is to make sure
[15:28.920 --> 15:32.160]  that these calls for action are followed through
[15:32.160 --> 15:36.400]  and we get the best result possible.
[15:36.960 --> 15:39.200]  And the calls to action within the paper
[15:39.200 --> 15:42.480]  are our priorities for the special focus groups.
[15:42.480 --> 15:45.200]  So obviously the manifesto is almost the easy bit.
[15:45.200 --> 15:48.360]  We've launched a manifesto, we've put out there what we want
[15:48.360 --> 15:50.360]  and now over the next year or two,
[15:50.360 --> 15:53.000]  we will then start working through those calls to action,
[15:53.000 --> 15:55.520]  reaching out to the right stakeholders
[15:55.520 --> 15:58.640]  and ensuring that we can not necessarily
[15:58.640 --> 16:00.400]  fix everything we're asking for,
[16:00.400 --> 16:02.400]  but work with the right stakeholders
[16:02.400 --> 16:05.240]  that have ownership of some of the things that we need
[16:05.240 --> 16:07.640]  to make sure that they see the industry perspective
[16:07.640 --> 16:11.160]  and some of the challenges that you guys might face.
[16:11.160 --> 16:12.000]  Yeah.
[16:12.000 --> 16:16.280]  I mean, a great one is the DPIA and DTAC processes
[16:16.280 --> 16:19.360]  and procedures, you know, if we can standardise that,
[16:19.360 --> 16:23.200]  it makes life so much easier for all suppliers,
[16:23.200 --> 16:28.200]  all NHS staff, you know, the NHS, NHS England,
[16:29.120 --> 16:33.000]  the DHSC, it's a win-win for everybody.
[16:33.000 --> 16:35.240]  So, you know, that's a great example
[16:35.240 --> 16:39.600]  where we can save lots of red tape, lots of cost
[16:39.600 --> 16:43.600]  and that money can be passed back to the patient
[16:43.600 --> 16:47.600]  and people will have more time to deliver the services
[16:47.600 --> 16:50.560]  and concentrate on patient delivery
[16:50.560 --> 16:52.360]  rather than administration.
[16:53.320 --> 16:54.160]  Absolutely.
[16:54.160 --> 16:56.280]  And we've been working really hard on DTAC
[16:56.280 --> 16:58.880]  the last sort of 18 months to two years
[16:58.880 --> 17:02.240]  with NHS England, the Department of Health and Social Care
[17:02.240 --> 17:03.720]  and a full review's been done
[17:03.720 --> 17:05.920]  and we're actually gonna be doing some work over the summer.
[17:05.920 --> 17:08.320]  There's kind of some short, medium and long-term goals
[17:08.320 --> 17:10.360]  that have now been set in line
[17:10.360 --> 17:12.160]  with all the recommendations that we made.
[17:12.160 --> 17:15.960]  So that's a really positive thing to take forward.
[17:15.960 --> 17:18.320]  In the paper, in the manifesto,
[17:18.320 --> 17:21.800]  and obviously we've also got our cybersecurity working group,
[17:21.800 --> 17:25.480]  cybersecurity has become a big concern for our members.
[17:25.480 --> 17:28.520]  And I just wondered if you guys can explain why.
[17:28.520 --> 17:31.840]  So, Saduf, are you happy to start on this one?
[17:31.840 --> 17:32.680]  Yeah, for sure.
[17:32.680 --> 17:36.600]  I mean, like healthcare organizations
[17:36.600 --> 17:40.400]  like the ones that we work with and our members,
[17:40.400 --> 17:45.400]  like the reason it's so, so high of a concern
[17:45.840 --> 17:49.120]  is essentially it's like high value targets
[17:49.120 --> 17:51.920]  with really high stakes consequences.
[17:51.920 --> 17:55.000]  Like lives are literally at stake
[17:55.000 --> 17:59.960]  if our hospitals are hit by big cybersecurity attacks.
[17:59.960 --> 18:04.400]  And we've already seen some of the consequences
[18:04.400 --> 18:06.120]  of those targeted attacks.
[18:07.240 --> 18:10.920]  There's been enough where emergency diagnostics
[18:10.920 --> 18:14.720]  have gone down, records can't be seen.
[18:14.720 --> 18:16.960]  You literally can't treat patients.
[18:16.960 --> 18:21.160]  And it's not just for the duration of that cyber attack.
[18:21.160 --> 18:24.240]  There's a massive knock-on effect that just compounds
[18:24.240 --> 18:29.120]  and adds to the pressures that our NHS is facing.
[18:29.120 --> 18:32.560]  So I think that's why, like that's the biggest reason
[18:32.560 --> 18:37.120]  why it's such a massive concern for us.
[18:37.120 --> 18:40.880]  And if we think away from the biggest concern
[18:40.880 --> 18:43.960]  which is patient lives, the cost of an attack
[18:43.960 --> 18:48.960]  on a publicly funded system like the NHS is devastating.
[18:50.800 --> 18:54.000]  Systems being down, it's not just recovering data
[18:54.000 --> 18:55.480]  but it's recovery of staff,
[18:55.480 --> 18:57.880]  it's clinical services that are canceled,
[18:57.880 --> 18:59.560]  it's the data that's stolen,
[18:59.560 --> 19:03.640]  it's the longer-term damage as well that's caused
[19:03.640 --> 19:08.640]  by cyber attacks on our sort of customers and our industry.
[19:12.200 --> 19:13.320]  Did you wanna add something Bob?
[19:13.320 --> 19:15.520]  Cause I know that this is like a big area
[19:15.520 --> 19:18.360]  that we worked on in the manifesto and-
[19:18.360 --> 19:19.200]  Absolutely.
[19:19.200 --> 19:24.040]  And again, I think maybe just to take an example,
[19:24.040 --> 19:26.600]  something from outside our industry,
[19:26.600 --> 19:29.840]  if we look recently at the Marks and Spencer
[19:29.840 --> 19:34.120]  and Co-op cyber attacks, they were saying that
[19:34.120 --> 19:39.120]  it's cost Marks and Spencer 300 million in turnover
[19:39.400 --> 19:44.400]  and their online services won't be back in full operation
[19:46.200 --> 19:48.480]  until probably the end of July.
[19:48.480 --> 19:53.200]  But if you take it also the credibility and the marketing
[19:53.200 --> 19:56.880]  and the loss of customers, et cetera,
[19:56.880 --> 19:59.040]  that cost will double or treble.
[19:59.040 --> 20:03.040]  You know, how many customers of Marks and Spencer's
[20:03.040 --> 20:06.320]  have now gone elsewhere to buy their goods
[20:06.320 --> 20:09.360]  and will they stay elsewhere to buy their goods?
[20:09.360 --> 20:13.920]  And it really is the same with healthcare in a way
[20:13.920 --> 20:18.880]  that you can see what damage it can do to a business.
[20:18.880 --> 20:20.520]  It's very scary.
[20:20.520 --> 20:21.840]  It affects everything.
[20:21.840 --> 20:26.000]  And, you know, we said in our manifesto that
[20:26.000 --> 20:30.840]  we believe cyber security really should be part
[20:30.840 --> 20:33.040]  of our national security.
[20:33.040 --> 20:36.520]  Our national security is at stake, as Saduf said.
[20:36.520 --> 20:41.520]  If we can't carry out operations and perform, you know,
[20:42.440 --> 20:46.960]  all sorts of tests on patients, then in theory,
[20:46.960 --> 20:50.960]  you know, we really can't deliver healthcare.
[20:50.960 --> 20:54.520]  And that is a national security problem.
[20:54.520 --> 20:55.800]  Agreed.
[20:55.800 --> 21:00.800]  It's an amazing, you know, well, it could be catastrophic.
[21:01.320 --> 21:02.160]  It really could.
[21:02.160 --> 21:04.800]  And that's why we have to take these things seriously.
[21:04.800 --> 21:07.320]  And, you know, if I look back just at some of the things
[21:07.320 --> 21:10.040]  AXREM has done, we've done quite a lot.
[21:10.040 --> 21:13.760]  And, you know, the access controls, you know,
[21:13.760 --> 21:16.080]  when you look at our MFA document,
[21:16.080 --> 21:19.400]  the multi-factor authentication, you know,
[21:19.400 --> 21:24.240]  it's great advice for suppliers and customers
[21:24.240 --> 21:28.480]  on what should be used to enhance security.
[21:28.480 --> 21:30.200]  You know, as somebody said,
[21:30.200 --> 21:34.480]  we have to stay one step ahead of the criminals.
[21:34.480 --> 21:37.360]  So, you know, what is good today
[21:37.360 --> 21:40.240]  may not be good in a month's time.
[21:40.240 --> 21:44.120]  So it's a continued battle to stay ahead of the criminals.
[21:44.160 --> 21:48.640]  And I have to say, I think everything that we have done
[21:48.640 --> 21:51.360]  and the things that are in the manifesto,
[21:51.360 --> 21:54.800]  the work that's being done by the cyber group,
[21:54.800 --> 21:58.320]  what we're trying to help the NHS do,
[21:58.320 --> 22:02.040]  I think is really the correct steps
[22:02.040 --> 22:03.840]  and the right direction of travel.
[22:03.840 --> 22:06.480]  And, you know, we've seen recently
[22:06.480 --> 22:11.480]  that the DHSC, or the National Cyber Security Centre,
[22:12.480 --> 22:17.480]  are issuing a mandate for cyber security and a charter
[22:18.200 --> 22:21.200]  and asking all suppliers to sign up for that.
[22:21.200 --> 22:23.560]  I think it's a really great step forward.
[22:23.560 --> 22:26.840]  And again, I'm very proud to say that AXREM
[22:26.840 --> 22:29.200]  were one of the organisations that called for that
[22:29.200 --> 22:31.240]  and called for that to be done.
[22:31.240 --> 22:33.920]  And yeah, it's being done.
[22:33.920 --> 22:37.000]  So there's been some really good work
[22:37.000 --> 22:39.160]  by all the AXREM members.
[22:39.200 --> 22:42.400]  Yeah, I must at this point point our listeners
[22:42.400 --> 22:47.400]  to our Strengthening Cyber Security in UK Healthcare:
[22:47.520 --> 22:51.240]  A Strategic Approach for Continuity and Resilience paper
[22:51.240 --> 22:53.360]  that we published a few months ago.
[22:53.360 --> 22:54.840]  And in there, we actually do state
[22:54.840 --> 22:57.040]  that the healthcare IT sector
[22:57.040 --> 22:59.280]  with its complex digital infrastructure
[22:59.280 --> 23:02.080]  and critical service nature stands as a prime target
[23:02.080 --> 23:04.760]  in the evolving cyber threat landscape,
[23:04.760 --> 23:07.680]  offering numerous entry points for exploitation
[23:07.680 --> 23:09.840]  by malicious actors.
[23:09.840 --> 23:13.160]  So we know, like you say, Bob, we've seen big companies
[23:13.160 --> 23:16.240]  and obviously if the NHS got hit with a big cyber attack,
[23:16.240 --> 23:19.880]  that is obviously newsworthy around the world,
[23:19.880 --> 23:21.760]  which is what these people want.
[23:21.760 --> 23:25.880]  But critically, our patients are gonna massively suffer.
[23:25.880 --> 23:28.440]  So like you say, the work we're doing in this area
[23:28.440 --> 23:29.280]  is really important.
[23:29.280 --> 23:31.680]  And it isn't just these papers,
[23:31.680 --> 23:33.360]  it's in our strategic priorities,
[23:33.360 --> 23:35.120]  overarching the whole organization
[23:35.120 --> 23:37.760]  because almost every medical device
[23:37.760 --> 23:42.200]  that our members provide is part of an IT system.
[23:42.200 --> 23:45.520]  So it would be affected by a cyber attack.
[23:45.520 --> 23:46.360]  Yes. Yeah.
[23:46.360 --> 23:48.000]  And I think like you say there, Sally,
[23:48.000 --> 23:53.000]  like we know that through our own experiences and vendors,
[23:54.360 --> 23:57.080]  we know that many NHS systems still run
[23:57.080 --> 24:00.000]  on outdated infrastructure.
[24:00.000 --> 24:02.880]  There's poor patching, security practices
[24:02.880 --> 24:05.400]  aren't really where they should be,
[24:05.400 --> 24:08.080]  which is what makes them easier to exploit.
[24:08.080 --> 24:10.240]  It also means they're harder to defend.
[24:10.240 --> 24:13.360]  And as Bob said, it means that we're then slower
[24:13.360 --> 24:15.040]  to recover from an incident.
[24:15.040 --> 24:18.040]  I don't know if you remember like the WannaCry attack
[24:18.040 --> 24:22.320]  in 2017, Bob, you were just talking there about
[24:22.320 --> 24:27.080]  quite a short attack and how much that's cost M&S.
[24:27.080 --> 24:29.520]  I remember the WannaCry attack,
[24:29.520 --> 24:32.560]  I think cost the NHS something like 90 something
[24:32.560 --> 24:37.560]  million pounds and we lost over 20,000 appointments.
[24:38.120 --> 24:41.720]  Like that's just mad, right?
[24:41.720 --> 24:44.240]  And it's why I think we're all so passionate
[24:44.240 --> 24:47.080]  about cybersecurity and why it's one of the biggest
[24:47.080 --> 24:49.560]  priorities on our manifesto.
[24:50.760 --> 24:51.600]  Yes.
[24:51.600 --> 24:55.680]  And I'd just like to give a real life example.
[24:56.560 --> 25:01.400]  I met with a radiologist who was at Guy's and Tommy's
[25:01.400 --> 25:06.400]  four days into the latest cyber attack.
[25:06.800 --> 25:10.200]  And she was in tears, absolute tears
[25:10.200 --> 25:13.440]  because she couldn't perform her duties.
[25:13.440 --> 25:16.840]  And everyone was running around,
[25:16.840 --> 25:21.280]  everyone was trying to do their best to keep things moving.
[25:21.920 --> 25:26.920]  The effect on healthcare professionals is just immense.
[25:28.320 --> 25:32.760]  And these people are so proud of what they do.
[25:32.760 --> 25:37.480]  And when they're not able to do it, it's so personal.
[25:37.480 --> 25:42.480]  And it affected all of us while that was going on
[25:42.840 --> 25:45.680]  but it affects you because at the end of the day
[25:45.680 --> 25:47.680]  it could be a member of your family
[25:47.680 --> 25:49.800]  that's not getting treated.
[25:49.800 --> 25:52.920]  So I think, as I say, it's very easy to get lost
[25:52.920 --> 25:57.080]  in cybersecurity and think it's a technical thing, but it's not.
[25:57.080 --> 25:58.240]  It's a patient.
[25:58.240 --> 26:01.200]  Remember there's a patient at the end of this
[26:01.200 --> 26:04.600]  and it could be a member of your family.
[26:04.600 --> 26:06.480]  So it's so important.
[26:06.480 --> 26:08.480]  And we're all patients at the end of the day, aren't we?
[26:08.480 --> 26:11.520]  So it is important for all of us.
[26:11.520 --> 26:13.080]  So obviously we've kind of touched on
[26:13.080 --> 26:15.360]  what we've been doing so far,
[26:15.360 --> 26:16.920]  what we're doing at the moment.
[26:16.920 --> 26:19.440]  So obviously kind of like looking further ahead
[26:20.080 --> 26:22.480]  and horizon scanning, what's next for the SFG?
[26:22.480 --> 26:24.960]  I know we've kind of touched on the manifesto
[26:24.960 --> 26:28.560]  calls to action will form kind of our priorities as such,
[26:28.560 --> 26:30.160]  but what else?
[26:30.160 --> 26:31.960]  I don't know if you want to start off.
[26:31.960 --> 26:32.880]  Yes, certainly.
[26:32.880 --> 26:37.880]  So at the moment we've seen a number of frameworks
[26:38.760 --> 26:40.360]  due for renewal.
[26:40.360 --> 26:43.840]  So there are some renewals of frameworks this year
[26:43.840 --> 26:47.680]  which all the suppliers are working on and AXREM.
[26:47.680 --> 26:52.240]  We've made, you could say as a group,
[26:53.240 --> 26:56.520]  some changes to the perceived T's and C's
[26:56.520 --> 26:58.480]  or requested some changes.
[26:58.480 --> 27:01.800]  So there's some good work being done in that area
[27:01.800 --> 27:04.400]  and that obviously will continue.
[27:04.400 --> 27:08.600]  I think there is a big concern at the moment
[27:08.600 --> 27:11.680]  around integration with other systems
[27:11.680 --> 27:13.920]  and standards being used.
[27:13.920 --> 27:17.600]  And of course the big one is artificial intelligence.
[27:18.520 --> 27:19.840]  And how we integrate
[27:19.840 --> 27:22.800]  and how artificial intelligence is used.
[27:22.800 --> 27:27.320]  So I think that will be a very big sort of discussion point
[27:27.320 --> 27:32.320]  in the coming year or so as we see more and more,
[27:33.240 --> 27:35.920]  you could say integrations of AI
[27:35.920 --> 27:39.960]  into our current systems within the NHS.
[27:39.960 --> 27:43.800]  So I suppose to me, they are the two major ones,
[27:43.800 --> 27:47.320]  but of course, whilst you're doing that,
[27:48.240 --> 27:50.360]  you also look backwards.
[27:50.360 --> 27:52.000]  We never know what's around the corner
[27:52.000 --> 27:55.000]  and of course the one that comes to mind was COVID.
[27:56.000 --> 27:59.160]  If COVID or something similar happened again,
[27:59.160 --> 28:01.800]  are we in a better position?
[28:01.800 --> 28:05.000]  And I would say one thing we have learned is
[28:05.000 --> 28:08.880]  whatever we need to do, I believe we're actually ready now.
[28:08.880 --> 28:13.280]  We're organised, we've got some really good things in place
[28:13.280 --> 28:18.280]  as AXREM to be able to help each other
[28:19.080 --> 28:23.000]  as well as the NHS in the event of an incident
[28:23.000 --> 28:24.600]  or something similar to that.
[28:24.600 --> 28:27.080]  And I think we've learned a lot from that.
[28:27.080 --> 28:32.080]  And I wanna make sure those, you could say actions
[28:32.200 --> 28:35.560]  are continually looked at and improved
[28:35.560 --> 28:39.840]  so that if we do ever see anything similar to that again,
[28:39.840 --> 28:44.360]  that we are ready and willing to do whatever we can,
[28:44.360 --> 28:46.880]  even though it may be small,
[28:46.880 --> 28:50.840]  to help everyone get through the next whatever is thrown at us.
[28:52.160 --> 28:54.600]  And Saduf, have you got anything to add to that?
[28:54.600 --> 28:59.040]  Yeah, I think AI is a major, major topic that's coming up.
[28:59.040 --> 29:02.200]  And I think Bob makes a great point there around
[29:02.200 --> 29:05.080]  how does that integrate interoperability,
[29:05.080 --> 29:07.120]  bringing AI into systems.
[29:07.200 --> 29:10.280]  But I think for me as well, I want to,
[29:10.280 --> 29:11.640]  like one of the priorities I think
[29:11.640 --> 29:14.680]  like the imaging IT focus group could look at
[29:14.680 --> 29:18.800]  is around defining and driving ethical AI adoption.
[29:18.800 --> 29:21.240]  How do we shape policy on validation?
[29:21.240 --> 29:24.320]  How do we shape deployment and governance, right?
[29:24.320 --> 29:28.760]  I'd love to see the group working with MHRA, with NICE,
[29:28.760 --> 29:32.120]  with the NHS AI lab to really like start
[29:32.120 --> 29:34.600]  to align industry offerings
[29:34.600 --> 29:38.280]  and make sure that they're compliant and meet regulations
[29:38.280 --> 29:39.840]  that we've had some input on.
[29:41.200 --> 29:43.680]  And then following on from that,
[29:43.680 --> 29:45.760]  like another one that I'd love to look at
[29:45.760 --> 29:50.160]  because it's so closely related to AI,
[29:50.160 --> 29:53.440]  is around the ethical use of patient imaging data.
[29:53.440 --> 29:58.200]  We're generating petabytes of data
[29:58.200 --> 30:00.720]  compared to what we used to generate, right?
[30:00.720 --> 30:03.040]  More data than ever before.
[30:03.040 --> 30:07.000]  And we know like it's being used in things like research
[30:07.000 --> 30:09.800]  and it's being used in AI training models.
[30:10.680 --> 30:13.800]  But I do wonder if like the focus group could help
[30:13.800 --> 30:16.640]  towards creating things like best practice frameworks
[30:16.640 --> 30:18.880]  for things like data sovereignty,
[30:18.880 --> 30:23.280]  for consent around transparency of use.
[30:23.280 --> 30:25.080]  You know, it's not just around,
[30:26.000 --> 30:30.240]  I want the imaging IT focus group to almost look beyond
[30:30.240 --> 30:34.120]  just what we're doing today with imaging and AI, right?
[30:34.120 --> 30:37.720]  And yes, integration is absolutely key,
[30:37.720 --> 30:40.120]  but I feel like the work that we've done in the past
[30:40.120 --> 30:42.640]  with interoperability means that we're probably
[30:42.640 --> 30:44.600]  a bit more ahead of where we thought we might be
[30:44.600 --> 30:48.160]  with AI integrations, but it's all the other side of it.
[30:48.160 --> 30:50.200]  Like, how do we use it safely?
[30:50.200 --> 30:53.600]  How do we make sure we're using it in the right way?
[30:53.600 --> 30:56.160]  You know, can we look at using it
[30:56.160 --> 30:58.600]  in a way that reduces burnout?
[30:58.600 --> 31:00.440]  Can we support productivity tools?
[31:00.440 --> 31:03.600]  Can we, you know, streamline the way that we use it?
[31:05.000 --> 31:09.360]  I think for me, the group should and could
[31:09.360 --> 31:12.400]  show real leadership in this space
[31:12.400 --> 31:16.320]  to drive like trustworthy innovation
[31:16.320 --> 31:19.160]  and responsible innovation.
[31:20.000 --> 31:24.400]  Yeah, and what's really nice is how well the imaging IT
[31:24.400 --> 31:27.040]  and the AI group work so closely together,
[31:27.080 --> 31:29.840]  having our joint meeting every year is just crucial,
[31:29.840 --> 31:30.680]  isn't it?
[31:30.680 --> 31:33.400]  So we can have those crossover conversations as well.
[31:33.400 --> 31:34.240]  So no, absolutely.
[31:34.240 --> 31:35.720]  And we've got the experts, so why not?
[31:35.720 --> 31:37.680]  If people are willing to,
[31:37.680 --> 31:39.880]  and we're able to do things like that,
[31:39.880 --> 31:41.880]  then absolutely, let's get ahead of the game
[31:41.880 --> 31:43.920]  and put this in place.
[31:43.920 --> 31:46.000]  Yeah, and just to say as well,
[31:46.000 --> 31:47.400]  to point out to our listeners
[31:47.400 --> 31:51.000]  that we do have a separate AI special focus group.
[31:51.000 --> 31:52.600]  So obviously there is a lot of crossover
[31:52.600 --> 31:55.200]  between imaging IT and AI,
[31:55.240 --> 31:58.120]  and there's some fantastic work being done by the AI group,
[31:58.120 --> 31:59.760]  which will be on another podcast.
[31:59.760 --> 32:02.720]  We're having an AI podcast as well.
[32:02.720 --> 32:05.560]  And we also lead on an AI think tank
[32:05.560 --> 32:08.160]  where you mentioned some of those organizations
[32:08.160 --> 32:12.560]  such as NHS England, MHRA, NICE.
[32:12.560 --> 32:14.440]  We engage with a lot of the societies
[32:14.440 --> 32:15.840]  and royal colleges within our area
[32:15.840 --> 32:19.200]  and bring all of those key stakeholders together for AI
[32:19.200 --> 32:21.520]  to really kind of try and align our messaging
[32:21.520 --> 32:25.000]  and align our work, so we also don't duplicate effort.
[32:25.800 --> 32:28.560]  So I think some of that stuff is already being done,
[32:28.560 --> 32:31.480]  and yes, we've got further, obviously, to go with it.
[32:33.000 --> 32:38.000]  I also think, Saduf, that, as I say, the joint meeting,
[32:41.120 --> 32:45.640]  this will be a great opportunity to raise those points
[32:45.640 --> 32:49.880]  and discuss them and say, where do they actually sit?
[32:49.880 --> 32:52.280]  And I completely concur with you with it.
[32:52.280 --> 32:54.080]  I think you're absolutely right,
[32:54.080 --> 32:56.520]  we need to ensure that whichever group it is
[32:56.520 --> 32:59.280]  is doing this, that these points are covered,
[32:59.280 --> 33:03.440]  because I think everyone benefits from what you said.
[33:03.440 --> 33:05.200]  Absolutely everyone benefits,
[33:05.200 --> 33:07.960]  and I totally agree with you.
[33:07.960 --> 33:10.560]  We need to ensure these points are covered,
[33:10.560 --> 33:14.040]  and it doesn't matter which group or a combination of groups,
[33:14.040 --> 33:16.040]  it just, we just need to make sure
[33:16.040 --> 33:19.960]  that these things are taken forward and are addressed.
[33:20.960 --> 33:22.600]  Yeah, absolutely.
[33:22.600 --> 33:26.320]  Now, just changing the tone up a little bit of the podcast,
[33:26.320 --> 33:30.040]  I am obviously biased,
[33:30.040 --> 33:32.320]  but I think that AXREM does a great job.
[33:34.160 --> 33:36.640]  I think when you look at where we were six years ago
[33:36.640 --> 33:37.560]  and where we are today,
[33:37.560 --> 33:40.520]  we are a completely different organization,
[33:40.520 --> 33:42.000]  but I'm bound to say that.
[33:42.000 --> 33:44.200]  So my question to you guys is,
[33:44.200 --> 33:46.320]  what do you think are the benefits
[33:46.320 --> 33:48.520]  of being involved in AXREM?
[33:48.720 --> 33:50.800]  They'll think you're lying through your teeth.
[33:50.800 --> 33:52.280]  Totally lying.
[33:52.280 --> 33:54.000]  Where is this coming from?
[33:54.000 --> 33:56.040]  What do you mean, good work you've done?
[33:57.360 --> 34:00.520]  I'll let you take over there next to me.
[34:00.520 --> 34:04.080]  You know, I absolutely agree, honestly.
[34:04.080 --> 34:08.000]  I, if I've not been vocal enough about the change
[34:08.000 --> 34:12.200]  that the current team and Sally yourself have spearheaded,
[34:12.200 --> 34:17.200]  like AXREM to me used to be this thing that, you know,
[34:17.240 --> 34:19.680]  my procurement team or my, you know,
[34:19.680 --> 34:21.520]  my marketing team would be like,
[34:21.520 --> 34:22.800]  do you want to sign this off?
[34:22.800 --> 34:23.800]  And I'd be like, yeah, sure.
[34:23.800 --> 34:25.640]  I didn't really know what to do with it.
[34:25.640 --> 34:29.000]  I just sit there, but that was years ago.
[34:29.000 --> 34:33.480]  And now it's an absolute must spend for me.
[34:33.480 --> 34:36.120]  Like one, you get to be in an awesome group
[34:36.120 --> 34:38.600]  and you get to be on wicked podcasts like this.
[34:39.480 --> 34:43.120]  But for me, like the voice that you get
[34:43.120 --> 34:45.880]  and the fact that you actually get to have a say
[34:45.880 --> 34:50.840]  in shaping things like national imaging policy and standards,
[34:50.840 --> 34:54.960]  you actually get a chance to be part of the process
[34:54.960 --> 34:57.880]  and advocate for the change you want to see
[34:57.880 --> 35:00.240]  rather than just having to deal with the change
[35:00.240 --> 35:02.040]  that's been thrust upon you.
[35:02.040 --> 35:05.080]  And you've been fundamental to that, Sally.
[35:05.080 --> 35:08.760]  You and the team have just been so key
[35:08.760 --> 35:12.360]  in making sure our industry voice is heard
[35:12.360 --> 35:15.560]  in national decisions and helping define the rules
[35:15.560 --> 35:17.000]  that shape our market.
[35:17.000 --> 35:19.240]  Like that's one of the biggest benefits for me
[35:19.240 --> 35:21.600]  out of being with AXREM.
[35:21.600 --> 35:22.600]  The fact that you can then...
[35:22.600 --> 35:24.080]  Can I just add to that now,
[35:24.080 --> 35:26.520]  that I am going to be able to get out of the room
[35:26.520 --> 35:27.880]  that I'm currently sitting in,
[35:27.880 --> 35:30.480]  where it's going to be something.
[35:30.480 --> 35:31.320]  I said Sally.
[35:31.320 --> 35:32.280]  No, did I mean Sally?
[35:32.280 --> 35:33.960]  I meant Mel and Naomi.
[35:35.480 --> 35:38.200]  I literally feel like this is one of those questions
[35:38.200 --> 35:40.600]  where we're like, and it's genuinely not.
[35:40.600 --> 35:43.880]  I think it's just good for listeners
[35:43.880 --> 35:47.760]  and companies that maybe have watched us from afar
[35:47.760 --> 35:49.760]  and might be considering membership.
[35:49.760 --> 35:52.200]  Oh, don't pretend, Sally, you just want to hear it.
[35:52.200 --> 35:54.480]  I'll take my five and let you.
[35:54.480 --> 35:57.800]  I'll come in there.
[35:57.800 --> 35:58.640]  I'll come in there.
[35:58.640 --> 36:00.640]  He's going to bring it back down to earth.
[36:03.000 --> 36:06.280]  I've been in the industry an awful long time
[36:06.280 --> 36:08.200]  and been on AXREM a lot longer
[36:08.200 --> 36:10.440]  than everybody else on this call.
[36:11.360 --> 36:13.560]  I've seen the real change.
[36:14.560 --> 36:16.880]  It's amazing what we've achieved
[36:16.880 --> 36:21.880]  in the last four or five years, whichever it's been.
[36:22.320 --> 36:24.480]  It's just been an amazing change.
[36:24.480 --> 36:27.000]  I mean, some of the highlights for me
[36:27.000 --> 36:29.600]  were the Future Leaders Council.
[36:29.600 --> 36:32.920]  I remember going to my first AXREM meeting
[36:32.920 --> 36:34.920]  and being scared to speak.
[36:34.920 --> 36:37.760]  Now, when you look at our executive,
[36:37.760 --> 36:40.080]  every time we go there, there's new members.
[36:40.080 --> 36:42.680]  Everyone gets a fair chance to speak.
[36:42.680 --> 36:47.080]  Everyone is looked at as just as important,
[36:47.080 --> 36:50.960]  whether you're a small company, a large corporate,
[36:50.960 --> 36:55.960]  whatever you are, everyone is given the same opportunity
[36:56.920 --> 36:58.960]  and everyone is listened to,
[36:58.960 --> 37:03.960]  and everyone has the same respect for everybody else.
[37:04.840 --> 37:07.120]  I think, as I said, it's this camaraderie
[37:07.120 --> 37:09.840]  within our industry that also helps.
[37:09.880 --> 37:14.080]  But the real benefit of it is the amount of knowledge
[37:14.080 --> 37:18.160]  and information that we have shared.
[37:18.160 --> 37:20.640]  It is an amazing thing.
[37:20.640 --> 37:21.760]  It really is.
[37:21.760 --> 37:24.680]  And I would encourage anyone to join it.
[37:24.680 --> 37:28.360]  All the things that you've said, you're absolutely right.
[37:28.360 --> 37:31.880]  The events we've done, everything,
[37:31.880 --> 37:34.400]  it's just been an incredible journey.
[37:34.400 --> 37:37.680]  And it's down to Sally and the team.
[37:37.680 --> 37:39.200]  Of course, you do it.
[37:39.200 --> 37:42.440]  As you said earlier on in this podcast,
[37:42.440 --> 37:46.040]  it was down to also the members and giving up their time
[37:46.040 --> 37:48.560]  to actually make this work.
[37:48.560 --> 37:51.480]  And I have to congratulate everyone
[37:51.480 --> 37:54.000]  because I would just say to anyone,
[37:54.000 --> 37:59.000]  if you want to be part of this business, join AXREM,
[38:00.000 --> 38:02.640]  get in the right special focus group.
[38:02.640 --> 38:06.000]  You have a great opportunity to meet colleagues,
[38:06.000 --> 38:08.360]  meet people in your industry.
[38:08.360 --> 38:13.040]  And more importantly than anything, hear what's being said,
[38:13.040 --> 38:15.920]  learn from it, take in that information.
[38:15.920 --> 38:20.920]  It's very valuable to your company within your industry.
[38:21.040 --> 38:23.640]  So I would encourage everyone.
[38:23.640 --> 38:27.080]  And just the last thing from a personal point of view,
[38:27.080 --> 38:31.520]  it's raised my profile dramatically.
[38:31.520 --> 38:33.880]  When you look at it, everyone knows you.
[38:33.880 --> 38:35.960]  Everyone wants to talk to you.
[38:36.920 --> 38:40.160]  Members want to share things with other members.
[38:40.160 --> 38:44.040]  It's been, as I said, it's created camaraderie.
[38:44.040 --> 38:45.840]  And I have to say, as I said,
[38:45.840 --> 38:47.920]  at the beginning of this podcast,
[38:47.920 --> 38:50.200]  we work in the very best industry
[38:50.200 --> 38:52.920]  and it's credit to everyone who works in it.
[38:52.920 --> 38:57.920]  There's not many places where you can actually collaborate
[38:58.440 --> 39:03.440]  in a meaningful way with your competition and your peers
[39:04.400 --> 39:08.120]  on challenges that affect all of us.
[39:08.120 --> 39:13.120]  There's not a platform that you can have that honesty
[39:13.400 --> 39:17.840]  and openness and transparency all together
[39:17.840 --> 39:20.840]  for what essentially is the greater good,
[39:20.840 --> 39:22.480]  which is our patients.
[39:23.800 --> 39:24.880]  I just want to add to that.
[39:24.880 --> 39:26.720]  For me, it is a team effort.
[39:26.720 --> 39:30.640]  It's all about the people, our members, the staff team,
[39:30.640 --> 39:31.600]  our stakeholders.
[39:31.600 --> 39:33.200]  We are one big community.
[39:34.160 --> 39:36.800]  And when I say this to people, it's so cheesy,
[39:36.800 --> 39:38.720]  but we are like one big happy family.
[39:38.720 --> 39:40.320]  I do feel like that.
[39:40.320 --> 39:43.760]  And I would say that if I had to explain
[39:43.760 --> 39:46.400]  or describe AXREM in six words,
[39:46.400 --> 39:50.040]  it would be open, transparent, inclusive, informative,
[39:50.040 --> 39:52.120]  engaging, and collaborative.
[39:52.120 --> 39:56.880]  And what a fantastic six words just to sum it up.
[39:56.880 --> 39:59.000]  And I do think that's what we've become.
[39:59.000 --> 40:01.440]  So I'm now going to hand over to Mel,
[40:01.440 --> 40:03.880]  who is going to completely mix things up.
[40:03.880 --> 40:04.840]  Mix it up.
[40:04.840 --> 40:06.000]  Just before we move on,
[40:06.000 --> 40:08.320]  I was just trying to work out, actually, Bob,
[40:08.320 --> 40:10.400]  you mentioned how many years you were working
[40:10.400 --> 40:11.720]  at that business, this business.
[40:11.720 --> 40:12.960]  And then obviously you just mentioned
[40:12.960 --> 40:15.240]  about your time in AXREM.
[40:15.240 --> 40:16.640]  I thought you were only 25.
[40:17.760 --> 40:19.560]  I was just going to say how many years, man.
[40:19.560 --> 40:20.400]  Yeah.
[40:20.400 --> 40:21.240]  Yeah.
[40:21.240 --> 40:25.600]  I'm like the Queen, though.
[40:25.600 --> 40:28.520]  She had two birthdays; I have one every two years.
[40:29.480 --> 40:30.680]  I was just going to say,
[40:30.680 --> 40:34.520]  that's more of Bob's mental capacity, 25,
[40:34.520 --> 40:37.440]  when he's the last one standing at 4 a.m.
[40:37.440 --> 40:38.760]  when we're at UKIO.
[40:40.200 --> 40:41.040]  Do you know what?
[40:41.040 --> 40:43.880]  It's all them years of hardcore socializing
[40:43.880 --> 40:45.000]  that's done it, right?
[40:45.000 --> 40:47.440]  It's just, I feel like, yeah,
[40:47.440 --> 40:49.360]  you've found the secret to youth
[40:49.360 --> 40:52.000]  or some sort of invisibility spell,
[40:52.000 --> 40:53.720]  invincibility spell, even.
[40:55.480 --> 40:57.640]  I wish I could take the credit for it,
[40:57.720 --> 40:58.760]  but I can't.
[40:58.760 --> 41:01.200]  It's because I think that I'm married
[41:01.200 --> 41:03.840]  to such a lovely wife who's teetotal.
[41:05.160 --> 41:08.360]  And I think the thing with it is, as you said,
[41:08.360 --> 41:12.520]  I enjoy the company of all the people that I work with.
[41:12.520 --> 41:15.360]  And it's customers, it's, as you say,
[41:15.360 --> 41:17.120]  work colleagues, et cetera.
[41:17.120 --> 41:19.080]  People think, I think, in our industry
[41:19.080 --> 41:21.400]  that we get together all the time.
[41:21.400 --> 41:24.960]  If I look at UKIO as a great example,
[41:24.960 --> 41:29.640]  there's people who go to UKIO from my own company
[41:29.640 --> 41:31.920]  who I see perhaps twice a year.
[41:33.000 --> 41:35.520]  And that, I think, sums it up.
[41:35.520 --> 41:38.760]  It's one place where you see customers.
[41:38.760 --> 41:42.280]  You think you see customers every month,
[41:42.280 --> 41:43.760]  every six months.
[41:43.760 --> 41:44.600]  You don't.
[41:44.600 --> 41:48.320]  Sometimes you don't see them from one UKIO to another
[41:48.320 --> 41:50.560]  because we are very busy people.
[41:50.560 --> 41:54.240]  So I think, for me, as I say, yeah,
[41:54.480 --> 41:55.320]  it's great.
[41:55.320 --> 41:57.240]  But if you've got that environment
[41:57.240 --> 42:00.080]  and you've got that chance to be with everyone,
[42:00.080 --> 42:02.440]  then you try to make the most of it.
[42:02.440 --> 42:05.560]  And I think that's what I try and do.
[42:05.560 --> 42:10.560]  You are the life and soul and the heartbeat behind a lot of it.
[42:10.560 --> 42:13.880]  You know, it's members like you that help gel everyone together.
[42:13.880 --> 42:15.960]  It's like the glue that kind of holds everyone together.
[42:15.960 --> 42:19.080]  And you being the life and soul is what does that.
[42:19.080 --> 42:22.480]  So, right, over to Mel for our quirky questions.
[42:22.640 --> 42:24.640]  So, yeah, my quirky question time.
[42:24.640 --> 42:28.320]  So if you had a time machine, would you go to the future
[42:28.320 --> 42:30.560]  or would you go back in the past and why?
[42:30.560 --> 42:33.200]  So, Saduf, do you want to start with that one?
[42:33.200 --> 42:34.040]  Oh, crikey.
[42:36.400 --> 42:40.400]  I think, I think I'd go back.
[42:41.360 --> 42:43.960]  I think I'd go back to the past.
[42:43.960 --> 42:45.480]  How far in the past?
[42:46.520 --> 42:48.360]  Just my own past.
[42:48.360 --> 42:49.760]  Oh, okay.
[42:49.760 --> 42:53.640]  I want to go back and have a do-over of certain things,
[42:53.640 --> 42:57.360]  but also I feel like I would advocate harder
[42:57.360 --> 42:59.000]  for change in certain areas
[42:59.000 --> 43:00.760]  so that the future could be better.
[43:01.680 --> 43:04.160]  So I don't want to go back in like a regretful way.
[43:04.160 --> 43:08.240]  I don't want to go back to, you know, too far.
[43:08.240 --> 43:11.480]  I just want to go back and use what I know now
[43:11.480 --> 43:13.680]  to set better foundations for the future
[43:13.680 --> 43:17.720]  and like get the chance to fix what could have been different
[43:17.720 --> 43:21.720]  and maybe, maybe just create like that butterfly effect
[43:21.720 --> 43:24.680]  or that ripple effect that would improve my world
[43:24.680 --> 43:27.600]  and the one that everyone then inherits from us.
[43:27.600 --> 43:32.000]  I think that could like be so powerful.
[43:32.000 --> 43:32.840]  Right?
[43:32.840 --> 43:33.680]  Good answer.
[43:33.680 --> 43:35.040]  I like it.
[43:35.040 --> 43:35.880]  Excellent.
[43:35.880 --> 43:36.880]  And Bob, what about you?
[43:38.000 --> 43:40.520]  Well, what a question.
[43:40.520 --> 43:45.360]  And firstly, I'm going to say I can't decide
[43:45.360 --> 43:47.840]  whether I want to go back to 1988
[43:47.840 --> 43:51.320]  and have cup final day again, when we won the cup,
[43:51.320 --> 43:53.720]  or go forward a number of years
[43:53.720 --> 43:55.480]  to when we win the Champions League.
[43:55.480 --> 43:59.160]  So I'm not sure about that, but yeah.
[43:59.160 --> 44:02.760]  But no, I would say I'd love to go back.
[44:02.760 --> 44:05.880]  I'd love to go back and do it again.
[44:05.880 --> 44:10.120]  I think I would love to say, yeah,
[44:10.120 --> 44:12.560]  to pick up on what we just laughed about.
[44:12.560 --> 44:14.880]  I'd love to be 25 again.
[44:15.360 --> 44:18.360]  And then have that opportunity with what I've learned
[44:18.360 --> 44:20.680]  to take that forward again
[44:20.680 --> 44:23.880]  with all these great new technologies like AI
[44:23.880 --> 44:26.840]  and all the things that we've been talking about.
[44:26.840 --> 44:30.640]  I think we could make such a difference.
[44:30.640 --> 44:32.880]  You know, I look back in my career
[44:32.880 --> 44:34.040]  and as I said earlier,
[44:34.040 --> 44:38.040]  the film to digital was a major thing for this country.
[44:38.040 --> 44:40.080]  And that was a real game changer.
[44:40.080 --> 44:43.560]  I think we're at that point now with the technology.
[44:43.560 --> 44:45.920]  I think we've got such wonderful technology.
[44:45.920 --> 44:47.640]  You know, people are living longer.
[44:47.640 --> 44:50.440]  That's because of the way we treat people
[44:50.440 --> 44:52.200]  and our healthcare system.
[44:52.200 --> 44:57.200]  And I think, again, we've added all this new technology.
[44:57.200 --> 45:00.560]  You know, I expect to live to 150.
[45:00.560 --> 45:01.800]  So there you go.
[45:01.800 --> 45:03.320]  That's why I'd like to go back.
[45:03.320 --> 45:07.200]  But yeah, I think that's probably the best thing.
[45:07.200 --> 45:09.680]  But I think there's a great opportunity
[45:09.720 --> 45:14.280]  for, you know, all of our future leaders
[45:14.280 --> 45:17.600]  to take what we've created and take this forward.
[45:17.600 --> 45:20.760]  I'd love to come back, maybe, and that's why I say,
[45:20.760 --> 45:23.120]  maybe to come back in the future,
[45:23.120 --> 45:25.640]  in a hundred years and see what they've done.
[45:25.640 --> 45:27.120]  See what it looks like.
[45:27.120 --> 45:30.680]  And then you could be proud that you started the journey.
[45:30.680 --> 45:31.520]  Yeah.
[45:31.520 --> 45:32.360]  And I'm with you, Bob.
[45:32.360 --> 45:33.720]  I don't want to grow old.
[45:33.720 --> 45:37.560]  And if I could live to 150 and still have a fulfilling life,
[45:37.560 --> 45:39.080]  I hope that's what happens.
[45:39.240 --> 45:41.640]  They say time flies when you're having fun.
[45:41.640 --> 45:44.000]  I might be having too much fun
[45:44.000 --> 45:48.640]  because I blinked and I was 21 and now I'm over double that.
[45:48.640 --> 45:51.760]  So yeah, I absolutely agree with you.
[45:51.760 --> 45:55.520]  But I'd like to thank you both for being on our show today.
[45:55.520 --> 45:56.360]  Hang on.
[45:56.360 --> 45:57.680]  Does that mean we don't get to find out
[45:57.680 --> 45:58.920]  what you two would do?
[45:58.920 --> 45:59.760]  Yeah.
[45:59.760 --> 46:00.600]  No, no, no, no, no.
[46:00.600 --> 46:01.440]  Oh no.
[46:01.440 --> 46:03.160]  We don't get to say goodbye yet.
[46:03.160 --> 46:06.200]  I want to know, Sally, so are you saying you'd go back
[46:06.200 --> 46:08.440]  and then Mel, what's yours?
[46:08.680 --> 46:09.520]  Oh, I don't know.
[46:09.520 --> 46:10.360]  Do you know what?
[46:10.360 --> 46:11.200]  I always ask it.
[46:11.200 --> 46:13.480]  I've never actually thought about it myself.
[46:13.480 --> 46:15.320]  And no one's asked you.
[46:15.320 --> 46:16.160]  No.
[46:16.160 --> 46:18.240]  That's because you've had the two best people on this show.
[46:18.240 --> 46:19.080]  Just saying.
[46:19.080 --> 46:19.920]  I know.
[46:19.920 --> 46:21.200]  What would I do?
[46:21.200 --> 46:22.040]  I don't know.
[46:22.040 --> 46:24.040]  I'm a little bit scared about the future
[46:24.040 --> 46:26.840]  because you just don't know what's going to happen, do you?
[46:26.840 --> 46:29.000]  So, do you know what?
[46:29.000 --> 46:31.080]  Until you two said what you've said today,
[46:31.080 --> 46:32.480]  I probably wouldn't even have thought like that.
[46:32.480 --> 46:34.120]  But I actually quite liked that.
[46:34.120 --> 46:36.080]  Whereas I have no regrets in my life,
[46:36.080 --> 46:39.240]  but actually to go back with the knowledge
[46:39.240 --> 46:41.520]  and experience and stuff I've got now,
[46:41.520 --> 46:42.920]  maybe it could just be more.
[46:42.920 --> 46:43.760]  Just more.
[46:43.760 --> 46:44.600]  I don't know.
[46:44.600 --> 46:45.440]  So yeah.
[46:45.440 --> 46:46.280]  Yeah, maybe.
[46:46.280 --> 46:47.280]  I'll have to think more, but yeah.
[46:47.280 --> 46:52.280]  I've just put in a development request for a time machine.
[46:52.360 --> 46:53.200]  Don't worry.
[46:53.200 --> 46:54.040]  We'll get one done.
[46:54.040 --> 46:58.840]  Yeah, so Saduf, I would definitely go back
[46:58.840 --> 47:01.360]  because I'd love my life to be longer,
[47:01.360 --> 47:04.040]  but I'd also love to spend time with friends and family
[47:04.040 --> 47:05.120]  that are no longer with us.
[47:05.120 --> 47:07.800]  So, I'd love to go back and be able to do that.
[47:07.800 --> 47:10.240]  So, yeah, absolutely.
[47:10.240 --> 47:12.520]  Right, now you need to let me do my outro, Saduf.
[47:12.520 --> 47:13.360]  Yes.
[47:13.360 --> 47:14.520]  Yes, all right.
[47:16.160 --> 47:19.000]  Right, so thanks, Saduf and Bob, for joining us today.
[47:19.000 --> 47:21.520]  I think we've all found out a lot more about you both
[47:21.520 --> 47:24.560]  and the Imaging IT Special Focus Group,
[47:24.560 --> 47:27.480]  which is a really proactive group within Axrem.
[47:27.480 --> 47:29.640]  A big thank you to Bob and Saduf for joining us
[47:29.640 --> 47:32.080]  and thank you to all of our listeners.
[47:32.080 --> 47:34.960]  During Series 5, we've been showcasing all of Axrem's
[47:34.960 --> 47:37.440]  Special Focus Groups, bringing you all the up-to-date
[47:37.440 --> 47:38.720]  news and information.
[47:38.720 --> 47:41.280]  I personally am very proud of what each and every group
[47:41.280 --> 47:43.520]  has been working on and has achieved.
[47:43.520 --> 47:45.520]  We hope you have all enjoyed hearing more
[47:45.520 --> 47:47.520]  and keep a listen out after the summer
[47:47.520 --> 47:49.880]  for the next edition of Axrem Insights,
[47:49.880 --> 47:51.240]  where we will be having discussions
[47:51.240 --> 47:53.280]  about healthcare hot topics
[47:53.280 --> 47:55.960]  and speaking to key professionals across the sector.
[47:55.960 --> 47:57.280]  See you soon.
[47:57.280 --> 47:58.920]  If you have enjoyed today's podcast,
[47:58.920 --> 48:00.400]  don't forget to hit subscribe
[48:00.400 --> 48:03.000]  or feel free to share the podcast with friends.