
Transform Podcast
The Transparity Transform Podcast brings together thought leaders from across the UK's technology sector to discuss the industry's and the nation's hottest topics. Brought to you by Transparity, the UK's most accredited pureplay Microsoft Partner.
Just Do AI: Everyone wants to ‘do AI’, but how can you make real progress?
Senior leaders are being told ‘we need to do AI’ to keep up with the buzz and potential of new AI capabilities. In competitive markets, or to boost productivity across the board, AI is seen as the answer to a lot of prayers. But IT leaders are falling into two camps: those that dive in headfirst, and those that get caught up in the utopian vision, unable to make progress.
In this episode, we’ll tackle both camps, providing clear, actionable priorities to get started immediately, whilst also making sure you’re not ‘just doing AI’ for the buzz of it all. Start making progress today and hear from industry experts on how they’re seeing immediate impact in their AI adoption.
Guests on today's episode are:
Henrique Moniz de Aragão, Head of Data & AI at Microsoft
Simon Willmore, Digital Transformation Director, BMT
Jodie Rodgers, Chief AI Officer at Transparity
Hosted by
James Taylor, Enterprise Account Executive at Transparity.
1
00:00:03.465 --> 00:00:05.955
Welcome to our debut episode of The Transform Podcast.
2
00:00:06.335 --> 00:00:08.835
Uh, we're gonna be diving into a really interesting topic
3
00:00:08.845 --> 00:00:10.715
today that we hear across the industry,
4
00:00:10.855 --> 00:00:14.315
and that is leaders being told to ‘just do AI’.
5
00:00:14.585 --> 00:00:17.675
What does that really mean? We're gonna unpack that today.
6
00:00:17.675 --> 00:00:20.475
We've got some fantastic guests with us starting with, uh,
7
00:00:20.475 --> 00:00:24.195
Henrique, who is, uh, head of data and AI at Microsoft UK.
8
00:00:24.215 --> 00:00:26.395
So, Henrique, thanks so much for joining us today.
9
00:00:26.815 --> 00:00:28.635
Um, I wonder if you could just tell us a little bit about
10
00:00:28.785 --> 00:00:30.515
your role in the organization
11
00:00:30.575 --> 00:00:32.195
and just give us a bit of background. Cool.
12
00:00:32.195 --> 00:00:33.515
Well, JT, it's great to be here
13
00:00:33.575 --> 00:00:35.795
and, um, feel honored to be here for the first episode.
14
00:00:36.125 --> 00:00:38.315
Thank you. I, um, yeah, I'm at Microsoft.
15
00:00:38.635 --> 00:00:40.435
I have a really great job.
16
00:00:40.615 --> 00:00:42.115
Uh, I get to lead a team
17
00:00:42.115 --> 00:00:45.915
of solutions specialists focused on everything to do
18
00:00:45.915 --> 00:00:47.875
with the data and AI offerings of Microsoft.
19
00:00:47.895 --> 00:00:50.235
So everything from databases, analytics
20
00:00:50.855 --> 00:00:54.355
and AI, including Azure AI services and Azure OpenAI.
21
00:00:54.355 --> 00:00:58.555
So, uh, for the uninitiated, I like to say that we work
22
00:00:58.555 --> 00:01:00.955
with customers that are at the forefront of really some
23
00:01:00.955 --> 00:01:03.395
of the most innovative use cases around this technology.
24
00:01:03.935 --> 00:01:06.235
Uh, a lot of it very customized as well,
25
00:01:06.735 --> 00:01:08.915
but, uh, we get to see a lot of what's coming,
26
00:01:08.935 --> 00:01:09.955
and that's a real joy.
27
00:01:10.615 --> 00:01:13.595
Uh, I've been at Microsoft, uh, just over a year and a half
28
00:01:14.375 --> 00:01:18.555
and spent all of my career in technology roles, uh, half
29
00:01:18.555 --> 00:01:20.955
of my career actually in consulting, like you guys.
30
00:01:21.095 --> 00:01:23.715
And that was a great foundation to really understand how
31
00:01:23.715 --> 00:01:25.835
to translate problems into solutions.
32
00:01:26.375 --> 00:01:29.275
Uh, but the other half in a number of companies, both, uh,
33
00:01:29.295 --> 00:01:32.635
global, uh, technology businesses as well as startups.
34
00:01:33.015 --> 00:01:35.755
Uh, but this by far has been some
35
00:01:35.755 --> 00:01:36.915
of the steepest learning curves
36
00:01:36.915 --> 00:01:38.155
for me in my professional career.
37
00:01:38.625 --> 00:01:40.755
Amazing. So you're really well placed then, based on
38
00:01:40.755 --> 00:01:42.795
that background to help customers lead their
39
00:01:42.935 --> 00:01:43.955
AI journey, I suppose.
40
00:01:44.415 --> 00:01:47.155
Um, for those that are just starting out on their journey,
41
00:01:47.425 --> 00:01:49.875
what are some of the, the sort of successes
42
00:01:49.875 --> 00:01:51.795
and failures that you're seeing in the space today?
43
00:01:52.015 --> 00:01:53.795
What's making people successful?
44
00:01:54.945 --> 00:01:56.915
Yeah, look, I think, uh, we're going
45
00:01:56.915 --> 00:01:58.075
through a really interesting time
46
00:01:58.075 --> 00:02:01.515
because the speed at which this technology is changing
47
00:02:02.175 --> 00:02:05.635
is far beyond our ability, um,
48
00:02:05.735 --> 00:02:08.355
or even our experience in terms of dealing with change.
49
00:02:08.895 --> 00:02:10.835
If you think about it for a second, uh,
50
00:02:10.835 --> 00:02:14.115
if you look at technologies that have taken time to reach,
51
00:02:14.175 --> 00:02:16.715
say, a hundred million people, uh, using it
52
00:02:16.715 --> 00:02:19.435
to take the internet, for example, that took seven years
53
00:02:19.495 --> 00:02:20.755
to reach a hundred million people.
54
00:02:21.495 --> 00:02:23.515
Uh, ChatGPT took 60 days.
55
00:02:24.345 --> 00:02:27.045
Uh, and so when I think back to the last,
56
00:02:27.045 --> 00:02:28.965
even the last six months, you know,
57
00:02:29.425 --> 00:02:31.925
it feels like AI technology progresses
58
00:02:31.945 --> 00:02:33.565
in kind of like dog years, right?
59
00:02:33.835 --> 00:02:36.645
It's like, you know, two months is like a year's worth
60
00:02:36.645 --> 00:02:38.045
of changes.
61
00:02:38.585 --> 00:02:41.485
And what I think is happening now is, you know, with, uh,
62
00:02:41.585 --> 00:02:43.805
the advances in capabilities for these models
63
00:02:43.805 --> 00:02:46.885
and these platforms to be able to do things like agentic AI,
64
00:02:46.885 --> 00:02:49.365
where the technology can not only give you information
65
00:02:49.365 --> 00:02:51.005
but actually take action, uh,
66
00:02:51.125 --> 00:02:53.405
I think it really puts the focus back on businesses
67
00:02:54.025 --> 00:02:57.005
to think about the business opportunities rather
68
00:02:57.005 --> 00:02:58.125
than the technology itself.
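To make that "not just information, but action" point concrete, here is a minimal sketch of agentic tool calling in Python. It assumes the openai package; the create_ticket tool and the model name are invented for the example, not a specific Microsoft offering.

import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def create_ticket(summary: str) -> str:
    # Stand-in action: a real system would call a ticketing API here.
    return f"Ticket opened: {summary}"

# Describe the one action the model is allowed to take.
tools = [{
    "type": "function",
    "function": {
        "name": "create_ticket",
        "description": "Open a support ticket on the user's behalf.",
        "parameters": {
            "type": "object",
            "properties": {"summary": {"type": "string"}},
            "required": ["summary"],
        },
    },
}]

response = client.chat.completions.create(
    model="gpt-4o-mini",  # illustrative model name
    messages=[{"role": "user", "content": "My laptop won't boot - please log it."}],
    tools=tools,
)

msg = response.choices[0].message
if msg.tool_calls:  # the model chose to act, not just answer
    args = json.loads(msg.tool_calls[0].function.arguments)
    print(create_ticket(**args))
else:
    print(msg.content)  # it answered with information instead

If the model decides the request needs an action, it returns a structured tool call rather than prose; the calling code stays in control of whether that action actually runs.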
69
00:02:58.825 --> 00:03:02.565
Um, if you think about other significant changes, uh,
70
00:03:02.865 --> 00:03:05.765
in technological paradigms in the past, in fact,
71
00:03:05.765 --> 00:03:07.165
not even technological paradigms.
72
00:03:07.285 --> 00:03:09.525
'cause I always talk about how we're past the picks
73
00:03:09.525 --> 00:03:10.645
and shovels phase of AI.
74
00:03:10.665 --> 00:03:12.805
Now, you know, a lot of the value in AI has gone
75
00:03:12.825 --> 00:03:14.845
to companies that produce models, companies
76
00:03:14.845 --> 00:03:16.645
that produce chips, or companies
77
00:03:16.645 --> 00:03:18.005
that produce the AI infrastructure.
78
00:03:18.585 --> 00:03:20.645
But what we're seeing now is a shift, um,
79
00:03:20.905 --> 00:03:24.325
to actually value being derived from opportunities
80
00:03:24.335 --> 00:03:26.765
where businesses are putting that technology to use.
81
00:03:27.305 --> 00:03:29.765
Um, and so the focus I think is not about AI.
82
00:03:29.795 --> 00:03:31.885
It's about, um, business outcomes.
83
00:03:31.995 --> 00:03:36.365
It's about, um, building a culture of
84
00:03:37.125 --> 00:03:39.365
adopting, uh, and adapting to change.
85
00:03:39.945 --> 00:03:42.965
Uh, but also it's about a significant shift
86
00:03:43.185 --> 00:03:47.085
of leadership in organizations, where there is strong commitment
87
00:03:47.505 --> 00:03:50.925
to really look at this point in time as the time
88
00:03:50.935 --> 00:03:52.485
where you are gonna be rewriting
89
00:03:52.485 --> 00:03:53.725
your organizational strategy.
90
00:03:55.095 --> 00:03:57.535
Interesting. I mean, on that point you made around,
91
00:03:57.755 --> 00:03:59.455
um, strong leadership, um,
92
00:03:59.905 --> 00:04:01.655
we're seeing customers approach AI
93
00:04:01.655 --> 00:04:03.095
and their AI strategy in different ways.
94
00:04:03.475 --> 00:04:05.895
Um, some very, very cautious starting
95
00:04:05.895 --> 00:04:07.655
with a very strategy led approach, others
96
00:04:07.655 --> 00:04:08.815
with quite an iterative one.
97
00:04:08.995 --> 00:04:11.575
Do you think there's a right answer in terms of the approach
98
00:04:11.595 --> 00:04:13.055
to getting started with AI?
99
00:04:13.555 --> 00:04:17.565
Um, I think I do have a point of view,
100
00:04:17.825 --> 00:04:19.485
and that is that I think
101
00:04:19.485 --> 00:04:21.285
that the top down leadership commitment
102
00:04:21.705 --> 00:04:25.685
and the commitment to, um, innovating and,
103
00:04:25.705 --> 00:04:29.125
and disrupting its own business is absolutely crucial.
104
00:04:29.825 --> 00:04:34.605
Um, I also think that a commitment to building up a culture
105
00:04:34.745 --> 00:04:35.925
of experimentation
106
00:04:36.545 --> 00:04:40.845
and empowering, uh, parts of the organization to be
107
00:04:40.845 --> 00:04:42.965
that connective tissue is also really important.
108
00:04:43.265 --> 00:04:47.045
But when it comes to what do you actually do, uh, I believe
109
00:04:47.045 --> 00:04:49.205
that the best thing to do is to just get started.
110
00:04:49.425 --> 00:04:52.965
And what do I mean by that? I mean, you know, you don't have
111
00:04:52.965 --> 00:04:55.485
to work with tech to be exposed
112
00:04:55.505 --> 00:04:58.885
and already be using these new technologies, right?
113
00:04:59.165 --> 00:05:01.685
I mean, as I said, ChatGPT took 60 days
114
00:05:01.685 --> 00:05:02.725
to reach a hundred million users.
115
00:05:02.735 --> 00:05:04.005
We're all using it in our day to day.
116
00:05:04.005 --> 00:05:05.965
And in fact, you know, the studies
117
00:05:06.035 --> 00:05:08.845
that we put out there when we were rolling out Microsoft
118
00:05:08.885 --> 00:05:12.085
Copilot showed that, at that point, I think the adoption
119
00:05:12.145 --> 00:05:14.845
of Microsoft Copilot in the enterprise was about 20%,
120
00:05:15.305 --> 00:05:16.925
but close to two thirds
121
00:05:16.925 --> 00:05:19.005
of all employees were saying they were using generative AI
122
00:05:19.005 --> 00:05:20.845
tools of some sort elsewhere.
123
00:05:21.305 --> 00:05:23.605
So that reinforces the importance of, um,
124
00:05:23.705 --> 00:05:24.925
top down leadership, putting
125
00:05:24.925 --> 00:05:26.085
this technology in the hands of people.
126
00:05:26.545 --> 00:05:29.125
But when it comes to use cases, yes, don't try
127
00:05:29.125 --> 00:05:30.205
and boil the ocean.
128
00:05:30.705 --> 00:05:31.965
Um, pick one
129
00:05:31.965 --> 00:05:36.285
or two key business needs that are perhaps not being met
130
00:05:36.625 --> 00:05:39.885
or being met less well than you would expect them to be.
131
00:05:40.465 --> 00:05:43.405
Um, be very clear about what are the business outcomes
132
00:05:43.405 --> 00:05:45.845
that you would like to see different, um,
133
00:05:46.305 --> 00:05:49.245
get the right people and with the right mindset involved,
134
00:05:49.795 --> 00:05:51.045
only then look at the technology,
135
00:05:51.115 --> 00:05:52.965
only then look at the models, look at the platform.
136
00:05:53.265 --> 00:05:55.165
Um, but I would always say start with one
137
00:05:55.165 --> 00:05:59.485
or two, with very, very clear, uh, guardrails around,
138
00:05:59.625 --> 00:06:01.925
um, KPIs for the business outcome
139
00:06:02.385 --> 00:06:03.925
and just get those done.
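Written down, one of those use cases with KPI guardrails can be as small as this sketch; every field and figure here is illustrative, not a prescribed template.

# Illustrative only: pin a pilot to a business need, an outcome, and
# numeric KPI guardrails before looking at models or platforms.
use_case = {
    "business_need": "Bid responses take too long to draft",
    "desired_outcome": "First drafts produced in hours, not days",
    "kpis": {"draft_turnaround_hours": {"baseline": 72.0, "target": 24.0}},
    "owner": "Head of Bids",       # the right people, the right mindset
    "review_after_weeks": 8,       # then decide: scale, adjust, or stop
}

def hit_target(kpi: dict, measured: float) -> bool:
    """Crude guardrail check: lower is better for a turnaround-time KPI."""
    return measured <= kpi["target"]

kpi = use_case["kpis"]["draft_turnaround_hours"]
print(hit_target(kpi, measured=20.0))  # True: the pilot met its guardrail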
140
00:06:04.715 --> 00:06:06.805
Yeah, it's interesting. I mean, when we talk to a lot
141
00:06:06.805 --> 00:06:09.045
of our customers, it feels like sometimes you have a chat
142
00:06:09.045 --> 00:06:11.045
with them and, and their use cases actually just leap off
143
00:06:11.045 --> 00:06:12.685
the page immediately in an early conversation.
144
00:06:13.265 --> 00:06:15.605
Are there any particular user stories, you know,
145
00:06:15.605 --> 00:06:18.165
if we put the, the sort of black box of AI tech to one side
146
00:06:18.165 --> 00:06:19.565
for a second, are there any particular use cases
147
00:06:19.595 --> 00:06:22.125
that have really stood out to you, or
148
00:06:22.345 --> 00:06:23.805
or things that customers have mentioned to you
149
00:06:23.805 --> 00:06:25.245
that have had a significant impact
150
00:06:25.585 --> 00:06:26.685
from an end user perspective?
151
00:06:26.685 --> 00:06:28.285
Setting aside what leadership see?
152
00:06:29.345 --> 00:06:31.085
Um, well, I think the personal stories
153
00:06:31.145 --> 00:06:32.365
are always the best, right?
154
00:06:33.385 --> 00:06:37.355
Uh, we really worked hard
155
00:06:37.575 --> 00:06:40.915
to ensure that we were getting AI in the hands of everyone.
156
00:06:40.935 --> 00:06:42.395
And the easiest way to do that is just
157
00:06:42.395 --> 00:06:43.515
to take an off the shelf tool
158
00:06:43.655 --> 00:06:45.355
that's focused on productivity gains.
159
00:06:45.735 --> 00:06:47.475
But genuinely, you know, I,
160
00:06:47.615 --> 00:06:50.075
and I'll, I'll speak from a personal experience, you know,
161
00:06:50.455 --> 00:06:53.875
uh, I work in a very complex, very large organization.
162
00:06:54.535 --> 00:06:58.875
Um, I don't think I've ever experienced this many emails
163
00:06:59.015 --> 00:07:00.915
and messages, you know, being thrown at me.
164
00:07:01.095 --> 00:07:03.355
And imagine if you take a week off of work, you know,
165
00:07:03.355 --> 00:07:06.595
the ability to be able to get back on top, um, within hours,
166
00:07:07.175 --> 00:07:10.715
um, or even, you know, today I have agents
167
00:07:10.715 --> 00:07:12.075
that I use, right?
168
00:07:12.135 --> 00:07:15.875
So I don't just have copilot to help me do a better job.
169
00:07:16.475 --> 00:07:20.235
I also have digital colleagues that do work on my behalf.
170
00:07:20.295 --> 00:07:22.995
And I'm talking about, you know, plowing
171
00:07:22.995 --> 00:07:25.835
through spreadsheets and figuring out, um,
172
00:07:26.255 --> 00:07:28.955
how we're trending over a period of time on some
173
00:07:28.955 --> 00:07:30.235
of our key performance indicators.
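As a sketch of that kind of spreadsheet-trawling job in Python, assuming pandas and an invented kpis.xlsx laid out with date, kpi and value columns:

# A 'digital colleague' style task: trend KPIs from a spreadsheet.
# The file name and column layout are assumptions for the example.
import pandas as pd

df = pd.read_excel("kpis.xlsx")
df["month"] = pd.to_datetime(df["date"]).dt.to_period("M")

# One row per month, one column per KPI, averaging within each month.
monthly = df.pivot_table(index="month", columns="kpi",
                         values="value", aggfunc="mean")

# Direction of travel: last month in the file versus the first.
trend = (monthly.iloc[-1] - monthly.iloc[0]).sort_values()
print(trend.to_string())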
174
00:07:30.375 --> 00:07:34.115
I'm talking about, um, using researcher agents
175
00:07:34.545 --> 00:07:36.595
that would do the work that would normally take me
176
00:07:36.915 --> 00:07:40.755
probably three afternoons in about 10 minutes to prepare
177
00:07:40.815 --> 00:07:44.995
for a customer meeting or, uh, to do some market analysis
178
00:07:44.995 --> 00:07:47.395
or even to give me a head start on go-to-market planning.
179
00:07:47.455 --> 00:07:49.555
So, you know, I think most of the stories
180
00:07:49.555 --> 00:07:52.355
that I'm hearing on a personal level, uh, tend to be
181
00:07:52.355 --> 00:07:54.355
around just that increase in productivity
182
00:07:54.375 --> 00:07:55.715
and impact that you can drive.
183
00:07:56.335 --> 00:07:58.555
Um, but that's only really just scratching the surface
184
00:07:58.695 --> 00:08:01.995
before you start to look at, um, beyond the user.
185
00:08:02.025 --> 00:08:04.115
When you start to look at the impact on business processes
186
00:08:04.115 --> 00:08:05.235
and some of the examples we're seeing there,
187
00:08:06.105 --> 00:08:07.605
That's a timely day-to-day benefit.
188
00:08:07.705 --> 00:08:09.325
I'm back from a week of leave today,
189
00:08:09.345 --> 00:08:10.925
and it helped me get caught up this morning
190
00:08:11.025 --> 00:08:12.885
so I could get back on with what we're doing today.
191
00:08:12.945 --> 00:08:14.645
So just hopping back slightly, you talked
192
00:08:14.645 --> 00:08:16.605
around strong leadership earlier, um, one
193
00:08:16.605 --> 00:08:18.325
of the things I suppose I'm interested in is we see a lot
194
00:08:18.325 --> 00:08:21.765
of success stories around AI from new organizations, people
195
00:08:21.765 --> 00:08:24.245
that are able to adopt AI from a greenfield perspective.
196
00:08:24.825 --> 00:08:26.725
If you're a longer standing organization
197
00:08:26.785 --> 00:08:29.925
and you are looking to embed more of that culture of, um,
198
00:08:30.025 --> 00:08:33.405
you know, rapid iteration to adopt AI, are there any kind
199
00:08:33.405 --> 00:08:35.485
of tips that you'd give to more entrenched leaders
200
00:08:35.745 --> 00:08:38.285
to really foster that culture of, uh,
201
00:08:38.345 --> 00:08:39.485
rapid iteration?
202
00:08:40.585 --> 00:08:44.325
Um, well, look, we, as a business at Microsoft, you know,
203
00:08:44.325 --> 00:08:46.285
we're trying to lead in this age of AI
204
00:08:46.385 --> 00:08:48.645
to really put this technology in the hands of
205
00:08:48.645 --> 00:08:49.765
as many people as possible.
206
00:08:50.305 --> 00:08:53.285
Uh, you know, we have a strong belief that diffusion
207
00:08:53.305 --> 00:08:56.205
of this technology is what's gonna drive the kind
208
00:08:56.205 --> 00:08:58.165
of impact that is promised.
209
00:08:58.545 --> 00:08:59.845
And if nobody's gonna use it,
210
00:08:59.915 --> 00:09:01.365
then what's the point of all of this?
211
00:09:01.785 --> 00:09:04.565
But what I am, um, uh,
212
00:09:04.625 --> 00:09:06.965
always highlighting is two things that, uh,
213
00:09:07.045 --> 00:09:10.485
I think leadership need to really, uh, take charge on.
214
00:09:11.235 --> 00:09:15.895
Uh, it's very easy to just delegate, uh, let's say to a team
215
00:09:15.955 --> 00:09:18.895
or to IT: go figure out what AI can do for us, right?
216
00:09:19.515 --> 00:09:21.855
Um, but it's actually two things
217
00:09:21.855 --> 00:09:23.655
that I think I always emphasize and,
218
00:09:23.655 --> 00:09:25.815
and I see it in organizations that are doing really well
219
00:09:25.915 --> 00:09:27.375
and sort of a bit ahead of the pack.
220
00:09:27.635 --> 00:09:30.975
Number one is a commitment, uh,
221
00:09:31.445 --> 00:09:32.655
with actual follow
222
00:09:32.655 --> 00:09:35.815
through action on the responsible use of AI.
223
00:09:35.815 --> 00:09:39.335
Yeah. Uh, why, um, for a couple of reasons.
224
00:09:39.335 --> 00:09:43.415
Number one, it helps build trust with, uh, employees,
225
00:09:43.875 --> 00:09:45.895
but also with external stakeholders
226
00:09:45.895 --> 00:09:46.975
that interact with your business.
227
00:09:47.355 --> 00:09:50.255
So absolute clarity on what's your stance on the responsible
228
00:09:50.255 --> 00:09:51.615
use of AI, what are you doing?
229
00:09:51.965 --> 00:09:54.775
What tools are you using to ensure that there are
230
00:09:55.455 --> 00:09:58.815
safeguards, um, and guardrails around what you will
231
00:09:58.815 --> 00:09:59.895
and won't do with this technology,
232
00:10:00.235 --> 00:10:03.375
but also when you're using it, um, the transparency behind
233
00:10:03.375 --> 00:10:05.095
that use that is absolutely key.
234
00:10:05.095 --> 00:10:07.135
And that only comes from top down leadership.
235
00:10:07.135 --> 00:10:10.215
And the second is, especially for, you know,
236
00:10:10.595 --> 00:10:12.495
longer standing organizations.
237
00:10:12.515 --> 00:10:14.535
And look, Microsoft turned 50 this year,
238
00:10:14.875 --> 00:10:17.455
that's a really long time in technological terms, right?
239
00:10:17.835 --> 00:10:19.935
Uh, and it's security, right?
240
00:10:20.035 --> 00:10:24.495
So, you know, it's not just the secure use
241
00:10:24.495 --> 00:10:28.015
of AI, but also the opportunity for you to uncover
242
00:10:28.275 --> 00:10:32.935
and address all of the data governance, um, uh, challenges
243
00:10:32.935 --> 00:10:34.255
that you might have in your organization.
244
00:10:34.255 --> 00:10:37.435
Because if you can't bring your data, you can't make, uh,
245
00:10:37.545 --> 00:10:39.555
safe use and secure use of this technology.
246
00:10:40.215 --> 00:10:41.755
And so I, I'd say trust
247
00:10:41.755 --> 00:10:44.995
and security are the two things that, for organizations
248
00:10:44.995 --> 00:10:46.755
that are a bit more longstanding
249
00:10:46.775 --> 00:10:48.795
and aren't building from the ground up, uh,
250
00:10:48.895 --> 00:10:50.195
we really try and emphasize,
251
00:10:51.715 --> 00:10:52.715
It's interesting. And the,
252
00:10:52.715 --> 00:10:54.885
the sort of cultural changes that I saw, you know,
253
00:10:55.015 --> 00:10:57.885
Satya mentioned recently that, um, you know, the, the,
254
00:10:57.885 --> 00:11:00.085
the change that AI is going to have on the industry
255
00:11:00.085 --> 00:11:03.285
and on the economy and the globe at scale is, uh, something
256
00:11:03.285 --> 00:11:04.965
that we've probably only seen previously from
257
00:11:05.305 --> 00:11:06.445
the Industrial Revolution.
258
00:11:06.865 --> 00:11:08.565
Um, I suppose if we jump forward a bit,
259
00:11:08.565 --> 00:11:10.565
if you put your futurist hat on for a second, you know,
260
00:11:10.565 --> 00:11:14.005
where do you see, uh, the world, Microsoft, the state
261
00:11:14.005 --> 00:11:15.845
of AI in say, 12 to 24 months from now?
262
00:11:16.005 --> 00:11:17.005
'cause we talked already about how
263
00:11:17.005 --> 00:11:18.205
fast this industry's moving.
264
00:11:18.285 --> 00:11:19.845
I mean, we've never seen anything like this before.
265
00:11:20.075 --> 00:11:24.445
Yeah, and I think, um, I think it was a Sam Altman essay
266
00:11:24.525 --> 00:11:25.765
that I was reading a while back,
267
00:11:25.825 --> 00:11:30.125
and he said something like, uh, we are close
268
00:11:31.105 --> 00:11:33.365
to superintelligence.
269
00:11:34.275 --> 00:11:37.225
Okay, uh, superintelligence is a state
270
00:11:37.225 --> 00:11:38.185
that supposedly is beyond
271
00:11:38.545 --> 00:11:39.785
artificial general intelligence, by the way.
272
00:11:40.245 --> 00:11:42.425
He said, we're close to superintelligence.
273
00:11:42.545 --> 00:11:44.425
I just don't know if we're ahead of it or behind it.
274
00:11:44.965 --> 00:11:47.465
Um, and I think what struck
275
00:11:47.465 --> 00:11:49.225
me was that the capabilities
276
00:11:49.245 --> 00:11:50.665
of the technology are actually
277
00:11:50.845 --> 00:11:52.185
way more advanced than we're actually
278
00:11:52.185 --> 00:11:53.405
making use of right now.
279
00:11:53.785 --> 00:11:54.965
And I think that, um,
280
00:11:55.285 --> 00:11:57.245
I don't think it's just gonna creep up on us one day
281
00:11:57.245 --> 00:11:58.765
and sort of punch us in the teeth,
282
00:11:58.905 --> 00:12:00.365
but we're just gonna wake up one day
283
00:12:00.365 --> 00:12:03.045
and realize that it's doing a lot of stuff for us, a lot
284
00:12:03.045 --> 00:12:05.845
of things that humans used to do, but much better.
285
00:12:06.745 --> 00:12:10.605
Um, I do have strong conviction that this is, uh,
286
00:12:10.625 --> 00:12:13.845
the greatest general purpose technology that we've ever had.
287
00:12:14.345 --> 00:12:17.365
Uh, if you think back to things like the printing press,
288
00:12:18.005 --> 00:12:20.245
electricity, the internet, I don't think
289
00:12:20.245 --> 00:12:22.245
that any other technology will have had this much
290
00:12:22.245 --> 00:12:25.485
of an impact on, um,
291
00:12:25.485 --> 00:12:27.285
where humanity is going or where our world is going.
292
00:12:27.825 --> 00:12:29.325
Uh, and there's big, big risks
293
00:12:29.325 --> 00:12:30.525
with it, but also great promise.
294
00:12:31.385 --> 00:12:34.725
Um, the latest study that we looked at from the IDC in the
295
00:12:34.865 --> 00:12:37.325
UK alone predicted that by 2030,
296
00:12:37.855 --> 00:12:39.925
these technological advances would add something like
297
00:12:39.925 --> 00:12:43.205
500 billion, um, pounds to GDP.
298
00:12:44.175 --> 00:12:45.435
Um, and, um,
299
00:12:45.855 --> 00:12:49.155
and I think a few years from now, the way that we interact
300
00:12:49.545 --> 00:12:51.235
with technology will be completely different.
301
00:12:51.775 --> 00:12:55.355
Um, my 4-year-old daughter will not use the internet in any
302
00:12:55.355 --> 00:12:56.715
way, um, like we do.
303
00:12:57.335 --> 00:13:01.995
Um, I think that if you can just ask AI to do everything
304
00:13:01.995 --> 00:13:03.795
that you would do in other systems,
305
00:13:04.775 --> 00:13:06.995
why would you ever log into any of those applications again?
306
00:13:07.765 --> 00:13:11.515
Right? And I just don't think we're ready for that yet.
307
00:13:12.295 --> 00:13:15.315
And it's gonna be a much different world in terms of
308
00:13:15.615 --> 00:13:19.495
how quickly you can go from, you know, business idea
309
00:13:19.995 --> 00:13:22.495
or social impact idea to outcome.
310
00:13:23.315 --> 00:13:27.335
Uh, and, you know, nobody's sitting around today, um,
311
00:13:27.815 --> 00:13:29.495
building a platform
312
00:13:29.495 --> 00:13:32.175
or technology going, oh, I'm gonna copy exactly what
313
00:13:32.815 --> 00:13:33.895
Microsoft Outlook looks like.
314
00:13:33.945 --> 00:13:34.975
Right? Um, even
315
00:13:34.975 --> 00:13:36.855
that will be a completely different experience
316
00:13:36.855 --> 00:13:37.855
for us in a few years time.
317
00:13:38.735 --> 00:13:39.785
It's interesting, isn't it? I mean,
318
00:13:39.785 --> 00:13:41.825
I suppose there's a message to young people, um,
319
00:13:42.315 --> 00:13:44.145
the job they're gonna take in the future
320
00:13:44.145 --> 00:13:45.385
probably doesn't even exist yet.
321
00:13:45.765 --> 00:13:49.025
Um, it's a really interesting place to be. Henrique,
322
00:13:49.025 --> 00:13:51.825
you said earlier around, um, you know, the responsibility
323
00:13:51.825 --> 00:13:53.425
that we have to make sure
324
00:13:53.425 --> 00:13:55.145
that we get these tools into the hands of everybody.
325
00:13:55.345 --> 00:13:57.785
I mean, how does that responsibility sit with you,
326
00:13:57.785 --> 00:13:59.385
with Microsoft, with a wider industry?
327
00:13:59.565 --> 00:14:01.545
You mentioned around us probably not being ready
328
00:14:01.545 --> 00:14:03.665
for what's coming, you know, what do we need to do
329
00:14:03.685 --> 00:14:06.945
as technologists to really change that
330
00:14:06.965 --> 00:14:07.985
as quickly as possible?
331
00:14:07.985 --> 00:14:09.265
Because this is a
332
00:14:09.265 --> 00:14:10.345
freight train we can't stop, right?
333
00:14:10.405 --> 00:14:11.625
So how do we get ahead of that?
334
00:14:12.015 --> 00:14:14.665
Yeah, look, I think this is such an important question
335
00:14:14.765 --> 00:14:18.265
and point, because what we're talking about in terms
336
00:14:18.265 --> 00:14:23.025
of the future of work impacts all the hundreds
337
00:14:23.025 --> 00:14:26.425
of millions of workers, but also all of the people
338
00:14:26.425 --> 00:14:29.225
that are gonna be of working age a few years from now.
339
00:14:29.965 --> 00:14:33.255
And I think that the concerns
340
00:14:33.255 --> 00:14:35.575
around job displacement are legitimate.
341
00:14:36.275 --> 00:14:40.535
Um, I know from looking at data from the likes
342
00:14:40.535 --> 00:14:44.015
of the World Economic Forum, that in the next five years
343
00:14:44.935 --> 00:14:46.135
globally, something like 90 million
344
00:14:46.135 --> 00:14:47.455
jobs are gonna be displaced.
345
00:14:48.075 --> 00:14:50.815
Um, but more than 170 million jobs will be created.
346
00:14:51.345 --> 00:14:52.765
So there'll be a net increase in jobs.
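(Taking those World Economic Forum round numbers at face value, that is roughly 170 million jobs created minus roughly 90 million displaced, a net gain of about 80 million, though the roles created won't be the same roles that are displaced.)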
347
00:14:53.465 --> 00:14:54.965
The problem is we dunno what those jobs are.
348
00:14:55.985 --> 00:15:00.885
And, um, what we need to do now as organizations, uh,
349
00:15:01.065 --> 00:15:05.605
is we need to put a significant focus on re-skilling, uh,
350
00:15:05.705 --> 00:15:06.925
and AI literacy.
351
00:15:07.705 --> 00:15:09.685
If you look at, uh, the top
352
00:15:10.265 --> 00:15:13.045
job skills on LinkedIn at the moment, number one is, um,
353
00:15:13.465 --> 00:15:17.085
AI skills, but then numbers two, three, four,
354
00:15:17.085 --> 00:15:20.405
and five are all human skills like adaptability, creativity,
355
00:15:20.405 --> 00:15:21.405
innovative thinking.
356
00:15:21.905 --> 00:15:25.165
Um, and I think the people that are gonna win are people
357
00:15:25.165 --> 00:15:27.765
that pair, um, those AI skills
358
00:15:27.965 --> 00:15:31.565
with these unique human characteristics, uh, to be able
359
00:15:31.585 --> 00:15:32.765
to reshape their jobs.
360
00:15:33.145 --> 00:15:34.765
You know, the Industrial Revolution didn't
361
00:15:34.765 --> 00:15:36.565
destroy, you know, millions of jobs.
362
00:15:36.665 --> 00:15:37.725
It transformed them. Yeah.
363
00:15:37.725 --> 00:15:39.085
And that's what we're gonna go through as well.
364
00:15:39.745 --> 00:15:43.325
Uh, I think leaders need to, um, listen really carefully
365
00:15:43.385 --> 00:15:44.685
to what government is doing.
366
00:15:45.425 --> 00:15:48.205
So government has made a commitment to include, um,
367
00:15:48.585 --> 00:15:50.165
AI skills in the curriculum
368
00:15:50.945 --> 00:15:54.845
and a pledge, I think, to drive AI skills
369
00:15:55.065 --> 00:15:57.525
in something like
370
00:15:57.745 --> 00:16:00.845
7 million people, uh, we at Microsoft have made
371
00:16:00.845 --> 00:16:02.085
that commitment alongside government.
372
00:16:02.105 --> 00:16:03.405
And just in the last year,
373
00:16:03.405 --> 00:16:05.205
we've trained over 1.2 million people
374
00:16:05.395 --> 00:16:07.885
with AI skills completely for free across the UK.
375
00:16:08.705 --> 00:16:11.565
Uh, and I think that we need to be mindful of that
376
00:16:11.565 --> 00:16:14.085
for the future generations that are coming to the workplace.
377
00:16:14.705 --> 00:16:16.485
You know, as I said
378
00:16:16.485 --> 00:16:20.165
before, I think, um, every interface is gonna be agentic.
379
00:16:20.805 --> 00:16:22.325
I think that the way that we interact
380
00:16:22.325 --> 00:16:25.325
with technology is gonna be through natural language.
381
00:16:25.985 --> 00:16:28.765
And so we need to really rethink
382
00:16:28.915 --> 00:16:32.085
what work's gonna look like in the next few years.
383
00:16:32.925 --> 00:16:35.845
I also know that, um,
384
00:16:36.635 --> 00:16:38.085
something like two
385
00:16:38.085 --> 00:16:40.445
and a half billion people still don't have access
386
00:16:40.445 --> 00:16:41.445
to the internet globally.
387
00:16:42.145 --> 00:16:45.605
Uh, and a lot of those people might be in countries
388
00:16:45.605 --> 00:16:48.645
where they don't have access to reliable internet or,
389
00:16:48.945 --> 00:16:50.765
or even devices to access the internet.
390
00:16:51.145 --> 00:16:54.045
Um, but also a lot of those people are elderly people
391
00:16:54.235 --> 00:16:56.325
that never learned how to use a laptop, right?
392
00:16:56.385 --> 00:16:58.685
Or can't punch numbers into a screen,
393
00:16:58.745 --> 00:17:01.525
but what if they can now interact with your organization,
394
00:17:01.555 --> 00:17:03.445
your brand, or with your technology
395
00:17:03.445 --> 00:17:04.925
through natural language, right?
396
00:17:05.345 --> 00:17:08.685
And so there's a whole host of different demographics, uh,
397
00:17:08.825 --> 00:17:11.605
and also a whole host of different leaders
398
00:17:11.745 --> 00:17:12.805
and different organizations
399
00:17:12.805 --> 00:17:13.885
that need to be driving that charge.
400
00:17:14.395 --> 00:17:15.765
Yeah. It's amazing to see that firsthand.
401
00:17:15.965 --> 00:17:17.925
Actually, we recently, um, launched a case study
402
00:17:17.925 --> 00:17:20.285
where we worked with RNIB, um,
403
00:17:20.585 --> 00:17:21.765
and just seeing firsthand some
404
00:17:21.765 --> 00:17:24.205
of the stories from their users of, um, you know,
405
00:17:24.205 --> 00:17:25.805
real world examples of, um,
406
00:17:25.805 --> 00:17:27.405
remote workers within their organization
407
00:17:27.405 --> 00:17:29.085
that previously couldn't do things
408
00:17:29.085 --> 00:17:32.325
because of their, you know, lack of eyesight, um,
409
00:17:32.625 --> 00:17:34.645
and AI tools basically, uh,
410
00:17:34.645 --> 00:17:35.805
leveling the playing field for them.
411
00:17:35.865 --> 00:17:38.645
So not just people that are maybe, um, you know,
412
00:17:38.645 --> 00:17:40.005
lower skilled or older,
413
00:17:40.105 --> 00:17:41.485
but also people with disabilities.
414
00:17:41.485 --> 00:17:42.925
It's, uh, it's really amazing
415
00:17:42.945 --> 00:17:44.885
to see the impact that it's, that it's having.
416
00:17:45.785 --> 00:17:48.325
How can we make sure though that, um, you know,
417
00:17:48.325 --> 00:17:50.400
you talk about those areas of the world where
418
00:17:50.915 --> 00:17:52.685
perhaps it's just not, uh,
419
00:17:52.685 --> 00:17:54.805
readily accessible from a technology perspective.
420
00:17:54.805 --> 00:17:56.245
You know, what commitments has Microsoft
421
00:17:56.245 --> 00:17:58.165
and others made to really try and close that gap?
422
00:17:58.165 --> 00:17:59.525
Because, you know, some of the people
423
00:17:59.525 --> 00:18:01.565
that may solve the world's future problems may sit within
424
00:18:01.565 --> 00:18:02.685
those, those communities, right?
425
00:18:02.755 --> 00:18:04.605
Yeah. I mean, look, um, I can't speak
426
00:18:04.605 --> 00:18:05.925
for every technology company in the world,
427
00:18:06.065 --> 00:18:08.645
but I know that for Microsoft, for us, um,
428
00:18:08.915 --> 00:18:10.925
putting this technology in the hands of as many people
429
00:18:10.925 --> 00:18:13.605
as possible is what we believe is gonna be the great unlock
430
00:18:13.605 --> 00:18:15.405
of building trust and diffusing technology.
431
00:18:15.705 --> 00:18:19.125
You know, we're on track to spend $80 billion this year on
432
00:18:19.225 --> 00:18:22.005
AI infrastructure, and it's not just infrastructure
433
00:18:22.105 --> 00:18:23.645
in developed countries.
434
00:18:23.645 --> 00:18:25.565
This includes the global south, uh, as well.
435
00:18:25.945 --> 00:18:28.005
So, um, in Africa and South America
436
00:18:28.025 --> 00:18:31.005
and Southeast Asia, uh, we're making significant investments
437
00:18:31.025 --> 00:18:33.765
to enable every part of the world to be able
438
00:18:33.765 --> 00:18:34.965
to have this technology in their hands.
439
00:18:35.545 --> 00:18:37.565
Uh, and, um, you know, we,
440
00:18:37.865 --> 00:18:39.685
we promote openness and partnership, right?
441
00:18:39.705 --> 00:18:43.165
So we open source a lot of our technology, um, we have,
442
00:18:44.685 --> 00:18:48.645
I lose track now, but something like 34,000 different large
443
00:18:48.885 --> 00:18:50.205
language models on our platform.
444
00:18:50.705 --> 00:18:54.725
Uh, and we believe in choice, uh, an option to,
445
00:18:55.265 --> 00:18:57.645
to drive the diffusion of technology to as many people
446
00:18:57.645 --> 00:18:58.885
who can benefit from it as possible.
447
00:18:59.465 --> 00:19:02.405
Um, I think we've learned some very harsh
448
00:19:02.405 --> 00:19:04.845
lessons in the past with things like electricity,
449
00:19:05.265 --> 00:19:07.365
you know, where, you know, if you look at a map
450
00:19:07.365 --> 00:19:11.445
of the world at night, um, you know,
451
00:19:11.445 --> 00:19:12.805
there's millions of people
452
00:19:12.805 --> 00:19:15.365
that still can't just light up a light bulb at night.
453
00:19:15.785 --> 00:19:17.165
And so we need to ensure
454
00:19:17.165 --> 00:19:19.605
that we don't make those same mistakes with ai.
455
00:19:20.385 --> 00:19:24.405
Um, but look, I'll tell you this, uh, JT: I think a lot
456
00:19:24.405 --> 00:19:26.365
of our listeners here are gonna be business leaders.
457
00:19:26.785 --> 00:19:29.565
Um, and I'm gonna say this, I think that the,
458
00:19:29.945 --> 00:19:33.365
the biggest innovators in this post-AI age are
459
00:19:33.885 --> 00:19:35.165
probably businesses and organizations
460
00:19:35.165 --> 00:19:36.205
that have nothing to do with AI.
461
00:19:36.435 --> 00:19:39.205
Yeah. Um, but are organizations that can look at this
462
00:19:39.205 --> 00:19:43.365
as an opportunity to, um, really reevaluate, um,
463
00:19:44.065 --> 00:19:45.725
how do they access new markets, right?
464
00:19:46.035 --> 00:19:48.405
Have they perhaps been unable to serve
465
00:19:49.045 --> 00:19:50.285
an underdeveloped market
466
00:19:50.285 --> 00:19:52.605
because of margin constraints, which now they can,
467
00:19:52.995 --> 00:19:54.965
what new products and services can they offer to the market?
468
00:19:55.545 --> 00:19:59.605
And can they even introduce perhaps entirely novel business
469
00:19:59.605 --> 00:20:02.445
models, uh, either as a result of
470
00:20:02.545 --> 00:20:04.685
or empowered by these technologies?
471
00:20:04.925 --> 00:20:07.165
I think if you think of, like, a lot of
472
00:20:07.165 --> 00:20:08.325
'what if' questions, yeah.
473
00:20:08.465 --> 00:20:11.005
Um, you're in a position to really be leading, um,
474
00:20:11.225 --> 00:20:12.725
in your market in the next few years.
475
00:20:13.075 --> 00:20:14.045
Yeah. It's interesting. Some
476
00:20:14.045 --> 00:20:15.125
of the use cases we've already seen.
477
00:20:15.245 --> 00:20:17.085
I mean, I work quite heavily with financial services
478
00:20:17.085 --> 00:20:19.405
institutions, and we're seeing things like the opportunity
479
00:20:19.465 --> 00:20:21.805
to bring lower cost services to the market.
480
00:20:21.805 --> 00:20:23.685
Something that you would traditionally not think of
481
00:20:23.685 --> 00:20:25.725
that industry where maybe you'd be thinking all they want
482
00:20:25.725 --> 00:20:27.765
to do is, is create more products, create more margin.
483
00:20:27.765 --> 00:20:29.405
Yeah. But actually they're saying, well, actually,
484
00:20:29.405 --> 00:20:30.845
if we can drive down our cost of delivery,
485
00:20:30.865 --> 00:20:33.125
we can offer a more competitive service to, to users.
486
00:20:33.265 --> 00:20:35.925
So there's some real world opportunity there, isn't there,
487
00:20:35.945 --> 00:20:38.285
for people to completely rethink the way that they,
488
00:20:38.285 --> 00:20:39.365
they deliver their business.
489
00:20:39.625 --> 00:20:41.845
And I think you're spot on with the point around,
490
00:20:42.265 --> 00:20:44.765
you know, actually sometimes being further away from AI
491
00:20:44.765 --> 00:20:47.325
and not looking at the technology first is really key
492
00:20:47.385 --> 00:20:49.285
to getting the most success out of it.
493
00:20:49.365 --> 00:20:51.965
I think we're seeing a lot of organizations where, um,
494
00:20:51.965 --> 00:20:53.925
they're really, they've got some tech
495
00:20:53.925 --> 00:20:54.965
and they're looking for a problem
496
00:20:54.965 --> 00:20:56.125
rather than the other way around.
497
00:20:56.125 --> 00:20:57.805
Right? Which we really need to try and stop people doing.
498
00:20:57.805 --> 00:20:59.805
And that's one of the main pieces of advice I'd give.
499
00:21:00.265 --> 00:21:02.005
Um, I suppose finally then closing out,
500
00:21:02.005 --> 00:21:03.765
everybody likes a little bit of a takeaway.
501
00:21:03.865 --> 00:21:07.205
So, um, you know, if there's one piece of sort of key advice
502
00:21:07.305 --> 00:21:09.445
or, or something that you're seeing really happen
503
00:21:09.475 --> 00:21:11.605
with certain customers that's really setting them up
504
00:21:11.605 --> 00:21:12.805
for success, what would be your,
505
00:21:12.805 --> 00:21:14.565
your closing statement on how they can get ahead?
506
00:21:15.145 --> 00:21:17.605
Um, look, as I said, this space is moving very quickly.
507
00:21:18.265 --> 00:21:21.325
And so every six months there are significant shifts
508
00:21:21.435 --> 00:21:24.445
that sometimes if you blink, you miss it.
509
00:21:24.945 --> 00:21:28.525
Um, and I think the last 24 months has all been about putting AI
510
00:21:28.525 --> 00:21:30.605
in the hands of people, get them familiar with how
511
00:21:30.605 --> 00:21:32.805
to prompt and how to use this new interface.
512
00:21:33.165 --> 00:21:37.325
I would say right now, um, embrace the opportunity to
513
00:21:37.915 --> 00:21:41.605
empower all of your people to build agents, right?
514
00:21:41.605 --> 00:21:45.525
Because, um, if you think about it,
515
00:21:46.145 --> 00:21:47.685
if there is a three-phase journey of
516
00:21:47.685 --> 00:21:49.165
how organizations are being transformed,
517
00:21:49.165 --> 00:21:52.525
and the first phase is every human has a copilot
518
00:21:52.525 --> 00:21:54.485
to make them more efficient and more productive.
519
00:21:54.905 --> 00:21:58.765
The second phase, which we're in right now, is humans, um,
520
00:21:58.945 --> 00:22:00.885
seeing digital colleagues join them.
521
00:22:00.955 --> 00:22:03.765
Yeah. Um, and ideally not digital colleagues that get given
522
00:22:03.765 --> 00:22:05.045
to them, but they create themselves.
523
00:22:05.075 --> 00:22:07.845
Yeah. And so, um, you know, embrace the platforms
524
00:22:07.845 --> 00:22:10.965
that allow your people to build agents
525
00:22:10.985 --> 00:22:12.085
to do jobs for them.
526
00:22:12.465 --> 00:22:14.325
And if people can start to experience that,
527
00:22:14.675 --> 00:22:19.325
they can achieve more by creating these agents
528
00:22:19.485 --> 00:22:22.685
of action, um, you will get a lot more out
529
00:22:22.685 --> 00:22:26.845
of your organization, um, rather than resistance.
530
00:22:27.505 --> 00:22:30.005
Um, and then the third phase is when these agents become
531
00:22:30.005 --> 00:22:33.045
fully autonomous and you have human-led, agent-operated
532
00:22:33.045 --> 00:22:35.245
organizations, and there's a few organizations already
533
00:22:35.245 --> 00:22:38.285
there, but the key takeaway right now is, um, put the tools
534
00:22:38.385 --> 00:22:40.845
for your people to be able to create agents
535
00:22:40.845 --> 00:22:43.885
that do jobs, um, repetitive jobs more efficiently.
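A minimal sketch of that second phase in Python: people define their own 'digital colleagues' that each own one repetitive job. The agents and the work they pretend to do are invented for illustration.

from dataclasses import dataclass
from typing import Callable

@dataclass
class Agent:
    name: str
    job: Callable[[], str]  # the one repetitive task this agent owns

def triage_inbox() -> str:
    return "3 threads need a reply; the rest is FYI."  # stand-in for real work

def kpi_watch() -> str:
    return "Win rate up 4 points month on month."      # stand-in for real work

# Phase two: each person builds their own colleagues rather than being handed them.
my_colleagues = [Agent("inbox-triage", triage_inbox),
                 Agent("kpi-watch", kpi_watch)]

for agent in my_colleagues:
    print(f"{agent.name}: {agent.job()}")  # human-led: the person reviews the output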
536
00:22:43.885 --> 00:22:46.845
And I think you can already see, um, a big lift in,
537
00:22:46.865 --> 00:22:48.325
um, in productivity.
538
00:22:48.795 --> 00:22:51.205
Amazing. Henrique, thanks so much for your time.
539
00:22:51.205 --> 00:22:52.365
Really lovely to have you here today
540
00:22:52.365 --> 00:22:53.605
for our initial podcast episode
541
00:22:53.665 --> 00:22:54.605
and, uh, look forward to
542
00:22:54.605 --> 00:22:55.605
carrying on working with you in the future.
543
00:22:55.705 --> 00:23:00.405
Thanks for having me. Our second guest
544
00:23:00.405 --> 00:23:02.445
today is, uh, Simon, who's the head
545
00:23:02.445 --> 00:23:03.845
of digital strategy at BMT.
546
00:23:03.845 --> 00:23:05.405
Simon, thanks so much for coming in today
547
00:23:05.465 --> 00:23:06.645
and talking a little bit about some
548
00:23:06.645 --> 00:23:07.925
of the awesome work we've been doing together.
549
00:23:08.365 --> 00:23:11.085
I suppose before we dig in, um, for those that don't know,
550
00:23:11.165 --> 00:23:13.005
BMT, do you wanna give us a bit of background as to
551
00:23:13.005 --> 00:23:14.205
what the organization is,
552
00:23:14.705 --> 00:23:16.805
and I suppose how AI fits into your role?
553
00:23:17.995 --> 00:23:20.135
Yes. Well, well, thank you for, uh, thank you
554
00:23:20.135 --> 00:23:21.415
for inviting me here today.
555
00:23:21.915 --> 00:23:26.575
So BMT's a global mid-size enterprise.
556
00:23:26.675 --> 00:23:29.295
We do lots of different things from management consultancy
557
00:23:29.395 --> 00:23:32.055
to engineering services, including designing ships.
558
00:23:32.675 --> 00:23:35.655
Um, in other parts of the world we've got asset monitoring
559
00:23:35.655 --> 00:23:38.175
equipment on offshore oil platforms in the Gulf of Mexico
560
00:23:38.325 --> 00:23:41.575
through to environmental, um, science
561
00:23:41.635 --> 00:23:43.295
and services in Australia doing
562
00:23:43.495 --> 00:23:44.575
a whole range of different things.
563
00:23:45.435 --> 00:23:48.905
And so as a knowledge-based business, um,
564
00:23:49.325 --> 00:23:50.905
AI is quite fundamental to
565
00:23:50.905 --> 00:23:53.065
how we think things are gonna move forward in the industry,
566
00:23:53.065 --> 00:23:57.185
particularly with, um, maybe creating new content,
567
00:23:57.705 --> 00:24:00.225
summarizing content, understanding, uh,
568
00:24:00.225 --> 00:24:01.785
what our customer challenges are
569
00:24:01.885 --> 00:24:05.385
and how we can do more together, um,
570
00:24:05.725 --> 00:24:06.865
is really, really key to us.
571
00:24:07.085 --> 00:24:10.945
And so, you know, AI, as I look forward in my role, is kind
572
00:24:10.945 --> 00:24:12.905
of what's defined the last couple of years,
573
00:24:12.925 --> 00:24:14.985
and it's certainly gonna define the next few years as well.
574
00:24:15.515 --> 00:24:17.285
Certainly is. So, I suppose talk
575
00:24:17.285 --> 00:24:19.125
to us a little bit about the topic that a lot
576
00:24:19.125 --> 00:24:20.925
of leaders want to know about at the moment, which is sort
577
00:24:20.925 --> 00:24:22.405
of getting started, I suppose.
578
00:24:22.425 --> 00:24:24.685
So obviously lots of people have made progress with AI,
579
00:24:24.705 --> 00:24:28.445
but what was the first foray for BMT into AI and,
580
00:24:28.445 --> 00:24:31.125
and certainly from, from an end user impact perspective,
581
00:24:31.185 --> 00:24:32.845
you know, what's been the journey to date?
582
00:24:35.145 --> 00:24:38.525
So we started off with lots of enthusiasm.
583
00:24:38.705 --> 00:24:41.525
Uh, I recall, um, a few months
584
00:24:41.535 --> 00:24:44.005
after ChatGPT had sort of launched,
585
00:24:44.185 --> 00:24:46.045
and there were people buzzing.
586
00:24:46.195 --> 00:24:47.925
Yeah. Really loads
587
00:24:47.925 --> 00:24:49.485
and loads of excitement about this is gonna
588
00:24:49.485 --> 00:24:50.765
completely change the world.
589
00:24:50.765 --> 00:24:53.275
And I remember sitting in front of my browser
590
00:24:53.375 --> 00:24:55.115
for the first time thinking, this is great.
591
00:24:56.155 --> 00:24:57.595
I have no idea what we're gonna use it for.
592
00:24:57.955 --> 00:25:02.355
It was so hard to see where we, you know,
593
00:25:02.355 --> 00:25:04.955
where we were then to where we were gonna be
594
00:25:04.955 --> 00:25:06.395
as a business with this tool.
595
00:25:07.135 --> 00:25:10.915
Um, so we actually started looking at what we wouldn't do,
596
00:25:10.915 --> 00:25:12.555
where our red lines were gonna be.
597
00:25:12.735 --> 00:25:15.835
And we identified, uh, really, really early on that
598
00:25:16.635 --> 00:25:19.035
actually some aspects around, um, security
599
00:25:19.295 --> 00:25:21.195
and particularly kind of intellectual property,
600
00:25:21.195 --> 00:25:23.955
whether it was our information, our customers' information,
601
00:25:23.975 --> 00:25:26.115
our suppliers' information meant
602
00:25:26.115 --> 00:25:27.915
that actually there were some pretty hard boundaries
603
00:25:27.975 --> 00:25:31.515
to things that, um, if we were gonna use this technology
604
00:25:31.515 --> 00:25:33.235
that we'd have to worry about.
605
00:25:33.455 --> 00:25:35.795
So we actually started off with what we weren't gonna do
606
00:25:35.795 --> 00:25:40.505
with it. Um, very quickly that then moved on
607
00:25:40.505 --> 00:25:42.785
to thinking a bit more positively about, well,
608
00:25:42.815 --> 00:25:44.225
what could we do there?
609
00:25:44.245 --> 00:25:46.265
How would we start to look at use cases?
610
00:25:47.045 --> 00:25:48.865
And then we encountered the next kind of big problem,
611
00:25:48.915 --> 00:25:51.185
which was, how are you gonna measure value, yeah,
612
00:25:51.185 --> 00:25:54.425
outta any of these things. Um, what is the business case
613
00:25:54.445 --> 00:25:56.185
behind, um,
614
00:25:57.575 --> 00:25:59.795
behind generative AI particularly?
615
00:26:00.795 --> 00:26:03.215
And, um, that took quite a long time
616
00:26:03.315 --> 00:26:04.495
to get going and get started.
617
00:26:05.335 --> 00:26:06.635
And I suppose, you know, for those
618
00:26:06.635 --> 00:26:08.035
that are listening along, um,
619
00:26:08.735 --> 00:26:10.155
how did you actually go about that in the end?
620
00:26:10.155 --> 00:26:11.475
You know, what were some of the tips
621
00:26:11.495 --> 00:26:13.995
and tricks that you found for establishing business value?
622
00:26:14.075 --> 00:26:16.235
I mean, it's obviously really the start
623
00:26:16.235 --> 00:26:18.955
of anybody's AI journey is, is getting that buy-in from,
624
00:26:19.265 --> 00:26:20.635
from boards, from users, and,
625
00:26:20.635 --> 00:26:22.395
and the best place for people to start is showing
626
00:26:22.395 --> 00:26:23.755
how it's gonna actually benefit the business.
627
00:26:24.935 --> 00:26:27.795
Yes. And so we started, right, again, sort
628
00:26:27.795 --> 00:26:30.115
of at the beginning,
629
00:26:30.415 --> 00:26:32.675
uh, what was the business strategy overall, uh,
630
00:26:32.675 --> 00:26:34.635
where were our customers going?
631
00:26:35.295 --> 00:26:38.235
But actually we'd, we'd been through many years
632
00:26:38.255 --> 00:26:40.555
of transformation programs within the business
633
00:26:41.375 --> 00:26:43.245
and all the way along.
634
00:26:43.305 --> 00:26:46.365
We were finding that if you went for a strict kind
635
00:26:46.365 --> 00:26:49.325
of benefits, uh, management approach, uh,
636
00:26:49.325 --> 00:26:52.365
whilst it worked really well in really big organizations in
637
00:26:52.365 --> 00:26:54.285
a mid-sized business like us, we were starting
638
00:26:54.285 --> 00:26:57.485
to gravitate more towards finding six
639
00:26:57.545 --> 00:27:01.085
or seven kind of key outcome driven aspects of the business,
640
00:27:01.085 --> 00:27:03.125
whether that was how it was gonna support our growth,
641
00:27:03.225 --> 00:27:06.485
how it was gonna support our, improving our profitability,
642
00:27:06.625 --> 00:27:08.525
how we were gonna improve customer engagement.
643
00:27:09.145 --> 00:27:13.925
And so actually once we started off with defining
644
00:27:13.925 --> 00:27:15.685
what was important to us in those kind of seven
645
00:27:15.685 --> 00:27:18.445
or eight key areas, it was then a bit easier
646
00:27:18.445 --> 00:27:20.725
to start thinking about, okay, well what does AI mean?
647
00:27:20.725 --> 00:27:22.965
What does AI mean in the employee story?
648
00:27:23.025 --> 00:27:25.205
You know, in their journey from joining the business
649
00:27:25.305 --> 00:27:27.725
to being performance managed to leaving the business
650
00:27:27.745 --> 00:27:29.045
or where would
651
00:27:29.045 --> 00:27:30.125
it start to help?
652
00:27:30.125 --> 00:27:31.925
And we started to find that there were
653
00:27:32.845 --> 00:27:37.685
some kinda quite high level, um, use cases that,
654
00:27:37.785 --> 00:27:41.025
um, that thankfully, you know, aligned quite well with
655
00:27:41.025 --> 00:27:43.425
what generative AI would be quite useful for.
656
00:27:44.005 --> 00:27:45.545
Um, so we started off like that,
657
00:27:45.725 --> 00:27:48.465
but then recognized that actually doing some pilots,
658
00:27:48.465 --> 00:27:51.105
putting some technology in people's hands was,
659
00:27:51.365 --> 00:27:53.865
absolutely key then to sort of test out the idea, like,
660
00:27:53.865 --> 00:27:55.665
yes, you thought this was gonna be a good idea, yeah,
661
00:27:55.725 --> 00:27:57.305
but did it actually make a difference
662
00:27:57.385 --> 00:28:00.305
or has it just introduced another thing for you to do on top
663
00:28:00.305 --> 00:28:02.345
of not getting the answers to your questions?
664
00:28:03.265 --> 00:28:05.745
Absolutely. Um, I suppose when getting those use cases
665
00:28:05.805 --> 00:28:07.145
as well, how did you find the dynamic?
666
00:28:07.245 --> 00:28:08.865
You know, we hear from a lot of, uh, uh, clients
667
00:28:08.865 --> 00:28:11.465
that we work with that, um, you know, the difference
668
00:28:11.465 --> 00:28:12.705
of opinion between say, leadership
669
00:28:12.705 --> 00:28:14.625
and those on the ground in terms of AI use cases.
670
00:28:14.655 --> 00:28:16.665
When you were pulling those together, how did you make sure
671
00:28:16.665 --> 00:28:19.145
that you got the right voices in the room, um,
672
00:28:19.205 --> 00:28:21.785
and make sure that those use cases were valuable, you know,
673
00:28:21.785 --> 00:28:23.865
to the, to the bottom line, to the business's progress
674
00:28:23.865 --> 00:28:26.785
and its future, but also actually solving real world
675
00:28:26.785 --> 00:28:28.185
problems in the hands of users,
676
00:28:28.285 --> 00:28:29.505
you know, how did you get that balance right?
677
00:28:30.145 --> 00:28:31.755
Yeah. Well, I think it's a,
678
00:28:32.675 --> 00:28:35.055
I think it's a challenge in whatever program you're doing.
679
00:28:35.095 --> 00:28:37.175
This isn't unique to AI at all.
680
00:28:37.475 --> 00:28:41.645
Um, from a top-down perspective,
681
00:28:42.045 --> 00:28:43.205
I think it depends on sort of who,
682
00:28:43.305 --> 00:28:44.725
who you have sort of in the room.
683
00:28:45.065 --> 00:28:49.885
Um, we as a business have got a very good leadership team,
684
00:28:50.185 --> 00:28:52.045
um, globally who are,
685
00:28:52.145 --> 00:28:53.645
who are very good at the business perspective,
686
00:28:53.645 --> 00:28:56.005
very good at finding the customer opportunity, kind
687
00:28:56.005 --> 00:28:57.005
of putting the customer in the room
688
00:28:57.065 --> 00:28:58.245
is, is really, really key.
689
00:28:58.755 --> 00:29:02.765
When it comes to technology though, um, it's one
690
00:29:02.765 --> 00:29:04.125
of the more challenging areas.
691
00:29:04.305 --> 00:29:05.365
So historically, the,
692
00:29:05.425 --> 00:29:07.005
the business has been around for a long time.
693
00:29:07.125 --> 00:29:09.085
A lot of our working practices are very established.
694
00:29:09.085 --> 00:29:11.685
And so they actually quite often look out into the staff,
695
00:29:11.835 --> 00:29:13.405
into roles like mine
696
00:29:13.405 --> 00:29:15.165
and other people's roles, to kind of lead,
697
00:29:15.235 --> 00:29:16.365
lead the charge, as it were.
698
00:29:16.625 --> 00:29:19.525
So actually they're very receptive to listening to, well,
699
00:29:19.555 --> 00:29:20.685
what is gonna make a difference?
700
00:29:20.685 --> 00:29:23.565
You know, an employee chatbot,
701
00:29:23.565 --> 00:29:25.325
Is that the way we want to go? Do we want to look at
702
00:29:25.465 --> 00:29:26.885
how we support the bidding process
703
00:29:27.185 --> 00:29:29.245
or how we do an aspect of, say,
704
00:29:29.555 --> 00:29:31.365
writing requirements for a customer?
705
00:29:32.065 --> 00:29:33.125
Um, and so actually
706
00:29:33.125 --> 00:29:35.525
what our leadership did is they looked at it more as,
707
00:29:35.875 --> 00:29:38.485
well, how do we lead and how do we enable the kind
708
00:29:38.485 --> 00:29:40.765
of the crowdsourcing of ideas from within the business
709
00:29:41.185 --> 00:29:44.845
to happen in a, um, safe space where they're supported
710
00:29:44.845 --> 00:29:46.845
to get on to, to look at technology.
711
00:29:47.785 --> 00:29:50.245
So you touched a little bit there on knowledge management.
712
00:29:50.245 --> 00:29:53.005
Clearly there's lots of information stored, there's a lot
713
00:29:53.005 --> 00:29:54.165
of value to that information.
714
00:29:54.545 --> 00:29:57.005
How have you leveraged AI to take advantage of that?
715
00:29:57.925 --> 00:30:00.725
Starting off we wanted to find out
716
00:30:01.095 --> 00:30:03.045
where would AI be really useful
717
00:30:03.145 --> 00:30:05.365
before we tackled the really big
718
00:30:05.365 --> 00:30:06.685
and daunting kind of 'what do you do
719
00:30:06.685 --> 00:30:07.885
with all your documents?' question.
720
00:30:08.545 --> 00:30:13.165
Um, so we started to look into well generating content,
721
00:30:13.645 --> 00:30:16.005
summarizing content, creating whatever, whatever,
722
00:30:16.245 --> 00:30:18.565
whatever the use case was gonna be, um,
723
00:30:18.675 --> 00:30:21.405
that whatever the specific use case was gonna be
724
00:30:21.405 --> 00:30:23.805
to solve the specific problem that somebody had.
725
00:30:24.465 --> 00:30:27.725
Uh, we said we needed a, we needed a test bed,
726
00:30:27.725 --> 00:30:31.005
we needed a way of working, which would allow us to
727
00:30:31.555 --> 00:30:33.685
take a bit more control over
728
00:30:34.545 --> 00:30:37.205
the way the prompts were handled, the, um,
729
00:30:37.775 --> 00:30:40.765
where the data was processed, how the data was kind
730
00:30:40.765 --> 00:30:41.765
of managed around the system.
731
00:30:41.865 --> 00:30:44.165
And so our first point actually was
732
00:30:44.165 --> 00:30:46.885
to solve the security problem that when we get
733
00:30:46.905 --> 00:30:50.445
to real-world applications with our data, our customers' data
734
00:30:50.465 --> 00:30:53.245
and our suppliers' data, where were we gonna do it?
735
00:30:53.245 --> 00:30:55.245
How were we gonna do that? And so we created
736
00:30:55.805 --> 00:30:58.925
a completely secure by design, uh,
737
00:30:59.725 --> 00:31:03.755
generative AI test bed that actually now,
738
00:31:03.895 --> 00:31:05.115
now we call them assistants,
739
00:31:05.175 --> 00:31:07.515
but I suppose they're a bit closer to having kind
740
00:31:07.515 --> 00:31:09.075
of agents already ready
741
00:31:09.075 --> 00:31:11.675
before we were talking about agents a year or so ago.
742
00:31:12.255 --> 00:31:15.235
And, um, and it was a really beneficial way
743
00:31:15.235 --> 00:31:17.595
of bringing different stakeholders from
744
00:31:17.595 --> 00:31:20.355
around the business together, trying out new ideas, trying
745
00:31:20.355 --> 00:31:22.235
to find out actually do those use cases work.
746
00:31:22.935 --> 00:31:24.875
And pretty quickly we ended up with, you know,
747
00:31:24.875 --> 00:31:26.995
that use case has spawned another 10 use cases,
748
00:31:27.375 --> 00:31:28.915
and we've ended up with a bit of a, probably a bit
749
00:31:28.915 --> 00:31:30.075
of a proliferation, really.
750
00:31:30.155 --> 00:31:33.155
I think out of the 400 users we've got using the application,
751
00:31:33.155 --> 00:31:35.315
we've got somewhere in the region of about 200 to 300
752
00:31:35.975 --> 00:31:38.475
of these separate, um, assistants.
753
00:31:39.285 --> 00:31:41.625
But what it was also helping us to do was to understand
754
00:31:42.085 --> 00:31:44.305
how important was that knowledge management problem?
755
00:31:44.605 --> 00:31:46.305
Is having 10 versions
756
00:31:46.325 --> 00:31:49.665
of a similar document in a team site somewhere, is
757
00:31:49.665 --> 00:31:51.025
that gonna cause us a problem
758
00:31:51.285 --> 00:31:54.385
or will we, will we be okay with that?
759
00:31:54.965 --> 00:31:57.185
And actually what it also helped us identify was
760
00:31:57.185 --> 00:31:58.305
that actually just knowing
761
00:31:58.415 --> 00:32:01.865
that information came from a document in the way that, um,
762
00:32:02.725 --> 00:32:05.105
say Copilot-type searches work at the moment
763
00:32:05.745 --> 00:32:07.185
actually wasn't good enough that we really needed
764
00:32:07.185 --> 00:32:09.825
to know a bit, bit of a deeper, kind of contextual level
765
00:32:09.905 --> 00:32:11.385
of it came from this document,
766
00:32:11.405 --> 00:32:13.265
but actually it came from this page
767
00:32:13.325 --> 00:32:16.105
or this subheading within, within this document as well.
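(A quick illustration of the provenance point Simon makes here: if every chunk of a document carries its page and nearest subheading at ingestion time, an answer can cite the exact spot rather than just the file. This is a minimal sketch of the idea, not BMT's implementation; the uppercase-line heading heuristic is an assumption.)

```python
from dataclasses import dataclass

@dataclass
class Chunk:
    text: str
    source_doc: str  # which document the text came from
    page: int        # page-level provenance
    heading: str     # nearest subheading, for deeper contextual citations

def chunk_pages(doc_name: str, pages: list[str], size: int = 800) -> list[Chunk]:
    """Split each page into fixed-size chunks, keeping provenance metadata."""
    chunks, heading = [], ""
    for page_no, page_text in enumerate(pages, start=1):
        for line in page_text.splitlines():
            if line.strip() and line.isupper():  # crude heading heuristic
                heading = line.strip()
        for start in range(0, len(page_text), size):
            chunks.append(Chunk(page_text[start:start + size],
                                doc_name, page_no, heading))
    return chunks
```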
768
00:32:16.725 --> 00:32:21.345
So having this kind of open test bed
769
00:32:21.345 --> 00:32:24.265
that we own and we operate, has allowed us actually
770
00:32:24.265 --> 00:32:26.185
to identify that actually there's more questions
771
00:32:26.255 --> 00:32:27.265
that we needed to answer
772
00:32:27.265 --> 00:32:30.025
before we could really get, get kind of real kind
773
00:32:30.025 --> 00:32:32.025
of good enterprise level benefit from our,
774
00:32:32.055 --> 00:32:33.265
from our knowledge out of it.
775
00:32:33.935 --> 00:32:35.545
It's interesting there, I mean, one of the things
776
00:32:35.545 --> 00:32:38.105
that I found particularly smart around the solution
777
00:32:38.105 --> 00:32:40.565
that you've built is, um, the way, you know, a lot
778
00:32:40.565 --> 00:32:42.965
of our clients talk about wanting to leverage these tools,
779
00:32:42.965 --> 00:32:44.365
but wanting to do so in a secure manner,
780
00:32:44.365 --> 00:32:45.845
which is obviously the point you just touched on.
781
00:32:46.145 --> 00:32:47.365
How did you go about that
782
00:32:47.505 --> 00:32:49.725
and um, how has that made a difference to
783
00:32:49.865 --> 00:32:51.605
how quickly the solution can be adopted?
784
00:32:51.625 --> 00:32:53.925
So I imagine that once you built that trust, the ability
785
00:32:53.925 --> 00:32:55.005
to roll that solution out
786
00:32:55.005 --> 00:32:57.805
and scale was, was much easier than otherwise would've been.
787
00:32:58.185 --> 00:33:00.765
The solution was almost done outta necessity for making,
788
00:33:00.905 --> 00:33:02.765
you know, rapid, rapid progress.
789
00:33:02.985 --> 00:33:07.925
Um, and so the, the, the approach to kind of virtualizing
790
00:33:08.185 --> 00:33:11.005
and then, um, chunking up the, uh, the data
791
00:33:11.005 --> 00:33:12.965
that was brought in allowed users
792
00:33:13.025 --> 00:33:15.645
to very quickly work on a project by project basis.
793
00:33:15.745 --> 00:33:17.525
And in our context, most of
794
00:33:17.525 --> 00:33:18.725
what we do is project by project.
795
00:33:18.825 --> 00:33:21.245
We don't need sort of a huge amount of
796
00:33:22.045 --> 00:33:23.445
learning from one project to the next.
797
00:33:23.465 --> 00:33:26.365
But yes, the methodology that we might have applied
798
00:33:26.365 --> 00:33:28.445
to a customer problem, yes, we wanna use that the next time,
799
00:33:28.505 --> 00:33:31.565
but actually that customer's document it pretty much,
800
00:33:31.585 --> 00:33:32.925
we don't, we don't touch,
801
00:33:32.945 --> 00:33:35.205
we can't touch from one project to another.
802
00:33:35.865 --> 00:33:37.925
Um, so our solution had to kind of mirror that
803
00:33:37.985 --> 00:33:39.005
to a degree as well.
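(Sketching the project-by-project mirroring Simon describes, with everything here hypothetical rather than BMT's actual stack: each project gets its own isolated store, so one customer's documents are never retrievable from another engagement. The keyword search stands in for a real vector index.)

```python
class ProjectWorkspace:
    """Isolated document store for a single customer project."""

    def __init__(self, project_id: str):
        self.project_id = project_id
        self._chunks: list[dict] = []  # stand-in for a real vector index

    def add_document(self, doc_id: str, chunks: list[str]) -> None:
        self._chunks.extend({"doc": doc_id, "text": c} for c in chunks)

    def search(self, query: str) -> list[dict]:
        # naive keyword match; a production system would use embeddings
        return [c for c in self._chunks if query.lower() in c["text"].lower()]

# One workspace per project, so no cross-project retrieval is possible.
workspaces: dict[str, ProjectWorkspace] = {}

def get_workspace(project_id: str) -> ProjectWorkspace:
    if project_id not in workspaces:
        workspaces[project_id] = ProjectWorkspace(project_id)
    return workspaces[project_id]
```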
804
00:33:39.505 --> 00:33:41.485
But also it, it helped us to identify
805
00:33:41.815 --> 00:33:44.405
where it would be important to have kind
806
00:33:44.405 --> 00:33:47.105
of institutional data sets.
807
00:33:47.105 --> 00:33:48.865
Being able to connect to, um,
808
00:33:49.005 --> 00:33:53.185
an entire SharePoint library full of curated content, um,
809
00:33:53.445 --> 00:33:55.745
was really, really key.
810
00:33:56.165 --> 00:33:58.145
But also with the introduction of, um,
811
00:33:58.145 --> 00:34:02.145
because we also adopted, um, M365 Copilot for a large part
812
00:34:02.145 --> 00:34:05.945
of the business as well, that at the same time has helped us
813
00:34:05.945 --> 00:34:07.905
to understand the difference between what does it mean
814
00:34:07.905 --> 00:34:09.305
for me for personal productivity
815
00:34:09.315 --> 00:34:12.025
where it can access everything within the,
816
00:34:12.085 --> 00:34:14.465
the Microsoft graph versus, well,
817
00:34:14.465 --> 00:34:15.945
when I'm working on a customer project,
818
00:34:16.035 --> 00:34:18.025
maybe I'll still use some of those features,
819
00:34:18.045 --> 00:34:22.225
but maybe there's some other aspects, um, around our, um,
820
00:34:22.245 --> 00:34:24.225
around our AI tools that actually are,
821
00:34:24.365 --> 00:34:25.905
are really beneficial in that way.
822
00:34:26.285 --> 00:34:27.865
And we're finding it that actually it's working,
823
00:34:28.695 --> 00:34:33.065
it's working really well to give our staff, um, lots
824
00:34:33.065 --> 00:34:35.745
of different AI tools to do different jobs.
825
00:34:36.185 --> 00:34:39.145
I think starting off with, you're gonna have one platform
826
00:34:39.145 --> 00:34:41.585
that's gonna do absolutely everything is, is probably
827
00:34:41.585 --> 00:34:43.225
where people hoped we were gonna start off from.
828
00:34:43.245 --> 00:34:46.465
And, and it may be in, in a few years time we'll get
829
00:34:46.465 --> 00:34:47.505
to a vision like that,
830
00:34:47.565 --> 00:34:50.425
but today we're sort of finding that it's your right tool
831
00:34:50.425 --> 00:34:53.385
for the job and, um, and go that way.
832
00:34:54.095 --> 00:34:56.705
Amazing. I suppose. So looking forward a little bit,
833
00:34:56.875 --> 00:35:00.145
where do you see BMT's use of AI in, say, 12 months' time?
834
00:35:01.465 --> 00:35:03.085
12 months feels like a very long time.
835
00:35:03.465 --> 00:35:08.435
Um, our next big focus is really into
836
00:35:08.435 --> 00:35:10.555
looking into kinda human agent teaming.
837
00:35:10.925 --> 00:35:14.635
Where does, where do we start to move from, um, lots
838
00:35:14.635 --> 00:35:17.555
of assistants, lots of kind of personal productivity hacks
839
00:35:18.265 --> 00:35:19.955
into, okay, this is,
840
00:35:20.225 --> 00:35:21.755
this is gonna be the way that we do something.
841
00:35:21.755 --> 00:35:23.915
When you're doing this activity, you know,
842
00:35:23.915 --> 00:35:26.075
you use this agent when you're doing this activity
843
00:35:26.175 --> 00:35:29.075
that's human led, and you work together to,
844
00:35:29.135 --> 00:35:31.995
to achieve an outcome that certainly, you know, is,
845
00:35:32.055 --> 00:35:33.675
is the next, is the next year for us.
846
00:35:33.735 --> 00:35:36.915
But also worrying about it in a very, in a very human way.
847
00:35:37.425 --> 00:35:40.165
Um, we're, we're very, very clear with our, with our,
848
00:35:40.265 --> 00:35:42.485
our employees that, that for us, this is not about,
849
00:35:42.485 --> 00:35:43.805
you know, taking people out the loop,
850
00:35:43.835 --> 00:35:48.205
it's about freeing up their time to do other activity, um,
851
00:35:48.375 --> 00:35:50.525
which is, you know, ultimately more, more valuable,
852
00:35:50.525 --> 00:35:52.685
but also with our customers being really clear that there's,
853
00:35:52.685 --> 00:35:55.565
there's still people quality assuring your outputs here.
854
00:35:55.565 --> 00:35:58.645
We're not just, you know, blindly relying on,
855
00:35:58.645 --> 00:36:01.085
um, on, on AI to, to do our work for us.
856
00:36:02.125 --> 00:36:05.545
And I suppose, um, you know, one of the things that's,
857
00:36:05.545 --> 00:36:07.065
that's probably pertinent to many
858
00:36:07.065 --> 00:36:09.945
of our listeners being business leaders, um, what are some
859
00:36:09.945 --> 00:36:10.945
of the sort of challenges
860
00:36:10.945 --> 00:36:12.585
that you faced along this journey so far
861
00:36:14.285 --> 00:36:15.535
From a people perspective?
862
00:36:15.785 --> 00:36:17.615
We'll start off with that. We'll start off there.
863
00:36:18.815 --> 00:36:21.515
Two completely different ends of a spectrum, um,
864
00:36:21.535 --> 00:36:23.235
across the whole kind of adoption
865
00:36:23.255 --> 00:36:26.475
and change curve we had the, the early adopters,
866
00:36:26.495 --> 00:36:28.755
the innovators: you're just not going fast
867
00:36:28.755 --> 00:36:29.835
enough, you know, you need to go fast.
868
00:36:29.895 --> 00:36:32.755
The world, it's changing. And there's a, there's a degree
869
00:36:32.775 --> 00:36:35.235
of, obviously there's a degree of truth in that. Yeah.
870
00:36:35.235 --> 00:36:37.475
We've seen, however many it's been,
871
00:36:37.475 --> 00:36:39.995
I think 12 different AI models released in the last,
872
00:36:39.995 --> 00:36:41.235
the last three months or so.
873
00:36:41.935 --> 00:36:44.755
And, and yes, yeah, the pace of change is, is really quite,
874
00:36:44.805 --> 00:36:46.315
quite rapid in some of those areas.
875
00:36:47.185 --> 00:36:51.765
But actually the pace at which real world change is,
876
00:36:51.825 --> 00:36:55.805
is coming is really, really hard to quantify.
877
00:36:55.805 --> 00:36:58.925
And I don't think we'll really know kind of exactly how fast
878
00:36:58.925 --> 00:37:00.565
that change has been until we're looking, looking,
879
00:37:00.565 --> 00:37:01.605
looking back down it.
880
00:37:01.705 --> 00:37:03.005
So on one side we had this,
881
00:37:03.545 --> 00:37:05.445
you're going too slowly, you need to move fast.
882
00:37:05.545 --> 00:37:07.245
And on the other side there was then the,
883
00:37:07.755 --> 00:37:08.925
well, it's gonna take my job.
884
00:37:09.005 --> 00:37:10.285
I, I want nothing to do with it.
885
00:37:10.385 --> 00:37:13.045
And what we've had to do is, is come up with different
886
00:37:13.925 --> 00:37:17.085
approaches and solutions to allow kind of both
887
00:37:17.085 --> 00:37:18.565
of those realities to exist.
888
00:37:18.945 --> 00:37:21.125
Not alienate the ones who don't want to use it,
889
00:37:21.125 --> 00:37:23.805
but also don't push the ones who do want
890
00:37:23.805 --> 00:37:25.725
to use it into some sort of shadow economy of,
891
00:37:25.945 --> 00:37:27.925
of using their own, you know, bring,
892
00:37:27.975 --> 00:37:30.205
bring in their own AI tools to work and,
893
00:37:30.345 --> 00:37:32.085
and actually then we lose even more control
894
00:37:32.085 --> 00:37:33.125
over our data. Yeah.
895
00:37:33.695 --> 00:37:35.195
Um, so one of the things that a lot
896
00:37:35.195 --> 00:37:36.955
of our customers tell us is that, um, you know,
897
00:37:36.955 --> 00:37:40.475
measuring business value from AI is increasingly difficult,
898
00:37:40.655 --> 00:37:43.035
um, rather than becoming easier, you know,
899
00:37:43.035 --> 00:37:44.035
how have you navigated that?
900
00:37:44.315 --> 00:37:46.315
I think it's incredibly difficult.
901
00:37:46.655 --> 00:37:49.415
Um, but it's not to say impossible.
902
00:37:49.755 --> 00:37:52.855
So our approach has been to look at it in the form of sort
903
00:37:52.855 --> 00:37:54.495
of leading and lagging indicators.
904
00:37:55.155 --> 00:37:58.775
Um, particularly where, particularly
905
00:37:58.775 --> 00:38:01.255
where we could look at sort of a, a leading indicator
906
00:38:01.255 --> 00:38:02.815
of being something like we've rolled out
907
00:38:02.815 --> 00:38:03.855
tools, are they being used?
908
00:38:04.365 --> 00:38:06.495
Okay, great, they are being used. Fantastic.
909
00:38:06.675 --> 00:38:10.615
We know we've got, um, a couple of hundred users
910
00:38:10.795 --> 00:38:14.135
who are on average using it every single week.
911
00:38:14.925 --> 00:38:18.195
Um, fantastic. But what are they doing?
912
00:38:18.575 --> 00:38:20.235
And some of the tools we've been able
913
00:38:20.235 --> 00:38:22.595
to build have been able to really clearly kind of measure that:
914
00:38:23.085 --> 00:38:24.415
they've done this activity
915
00:38:24.555 --> 00:38:27.895
and we predict this activity took this long normally,
916
00:38:28.595 --> 00:38:29.695
and we now know actually
917
00:38:29.695 --> 00:38:31.495
that the model does it this quickly.
918
00:38:31.635 --> 00:38:34.695
So we can start to now pull together a bit of, a bit of a,
919
00:38:34.695 --> 00:38:36.485
uh, bit of a leading indicator of, well, this,
920
00:38:36.485 --> 00:38:37.725
this should be saving time.
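(The leading indicator Simon describes boils down to simple arithmetic: keep a baseline estimate of how long each activity takes manually, log how long the AI-assisted run took, and sum the difference. A minimal sketch; the activity names and figures are invented for illustration.)

```python
# Baseline estimates (hours) for doing each activity manually.
BASELINE_HOURS = {"summarise_report": 3.0, "draft_proposal_section": 5.0}

def hours_saved(events: list[dict]) -> float:
    """events: [{'activity': str, 'ai_hours': float}, ...] from usage logs."""
    total = 0.0
    for event in events:
        baseline = BASELINE_HOURS.get(event["activity"])
        if baseline is not None:
            total += max(baseline - event["ai_hours"], 0.0)
    return total

print(hours_saved([{"activity": "summarise_report", "ai_hours": 0.25}]))  # 2.75
```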
921
00:38:38.505 --> 00:38:40.965
In terms of though, of, do you see that dropping
922
00:38:40.985 --> 00:38:42.925
to the bottom line, increasing revenue?
923
00:38:43.745 --> 00:38:47.285
Uh, they, I think the world is so volatile around all
924
00:38:47.285 --> 00:38:50.005
of the, these things that, um, it's been really, really hard
925
00:38:50.025 --> 00:38:53.565
to see any great changes one way, one way or the other.
926
00:38:54.325 --> 00:38:56.005
A lot of the things we're looking at are things
927
00:38:56.005 --> 00:38:57.965
that happen over quite a long period of time.
928
00:38:58.185 --> 00:39:01.085
So are we getting better at writing proposals,
929
00:39:01.385 --> 00:39:02.525
um, for customers?
930
00:39:03.585 --> 00:39:06.075
Well, yes, maybe, maybe our, our probability,
931
00:39:06.215 --> 00:39:08.235
our win rate's gone up. Fantastic.
932
00:39:08.935 --> 00:39:11.355
Uh, but at the same time as well, we can measure
933
00:39:11.355 --> 00:39:13.995
that maybe our cost of bidding has gone down.
934
00:39:14.875 --> 00:39:17.935
But often what we found is, is that the, the space
935
00:39:17.935 --> 00:39:19.175
that we are creating is getting
936
00:39:19.175 --> 00:39:20.295
filled with some other activity.
937
00:39:20.475 --> 00:39:22.455
So actually we're just bidding slightly more than we were
938
00:39:22.455 --> 00:39:23.735
bidding previously.
939
00:39:24.515 --> 00:39:27.135
So in one way, I'd say, yes, we are, we are seeing,
940
00:39:27.155 --> 00:39:29.055
we are seeing value, um,
941
00:39:29.055 --> 00:39:30.815
we're definitely seeing value in a leading way
942
00:39:30.975 --> 00:39:33.775
that we can identify that yes, we are doing activity,
943
00:39:33.825 --> 00:39:35.575
which is displacing something else,
944
00:39:36.835 --> 00:39:38.615
but that falling to the bottom line, I,
945
00:39:38.735 --> 00:39:41.015
I think we're still quite a long way away from really kind
946
00:39:41.015 --> 00:39:42.615
of seeing kind of that sort of tangible,
947
00:39:43.135 --> 00:39:44.295
tangible benefits to business.
948
00:39:44.725 --> 00:39:45.855
It's an interesting point then. 'cause
949
00:39:45.855 --> 00:39:47.735
what you were ultimately describing there is really a sort
950
00:39:47.735 --> 00:39:49.135
of trust based investment.
951
00:39:49.435 --> 00:39:52.095
So you talked earlier around how you're quite fortunate
952
00:39:52.095 --> 00:39:53.775
to have leadership that are willing to do that.
953
00:39:54.075 --> 00:39:57.135
Um, I suppose for those that maybe haven't experienced that,
954
00:39:57.675 --> 00:40:00.135
um, what's some ways that they can maybe encourage their,
955
00:40:00.225 --> 00:40:01.895
their colleagues, their leadership to,
956
00:40:01.995 --> 00:40:03.015
to lead on that basis?
957
00:40:04.745 --> 00:40:07.485
You've gotta find something that's a proxy
958
00:40:07.485 --> 00:40:09.725
that you care about, something that you can measure.
959
00:40:09.945 --> 00:40:12.485
So it might be that your business is obsessed with,
960
00:40:13.145 --> 00:40:14.645
uh, customer satisfaction.
961
00:40:15.635 --> 00:40:17.885
Okay. What goes into customer satisfaction?
962
00:40:17.885 --> 00:40:20.565
Well, if, if a key part of customer satisfaction is
963
00:40:20.565 --> 00:40:22.805
how quickly you can turn around
964
00:40:23.165 --> 00:40:25.725
a request from a customer for something, you can measure
965
00:40:25.725 --> 00:40:29.005
that and then you can build an AI use case quite quickly
966
00:40:29.105 --> 00:40:30.445
to say, well, we know
967
00:40:30.445 --> 00:40:33.005
that our customers are getting stuck at this point in the
968
00:40:33.005 --> 00:40:36.045
process that they maybe do an inquiry
969
00:40:36.665 --> 00:40:38.445
and it takes us two days to get back to them.
970
00:40:38.445 --> 00:40:40.525
Well, if I can, if I can reduce that down to a couple
971
00:40:40.525 --> 00:40:41.565
of minutes and get 'em
972
00:40:41.565 --> 00:40:43.125
to take the next step, I can really show that.
973
00:40:43.705 --> 00:40:46.925
So we, we did not wholly do anything out
974
00:40:46.925 --> 00:40:48.085
of blind trust at all.
975
00:40:48.225 --> 00:40:51.205
We didn't commit to spending any money until we'd started
976
00:40:51.305 --> 00:40:53.845
to identify, well, what's, what's gonna be the key thing
977
00:40:53.845 --> 00:40:56.245
that's gonna give us a, a return on investment?
978
00:40:56.945 --> 00:40:59.525
Um, the trick I would say though, is to find the thing
979
00:40:59.525 --> 00:41:01.685
that you can all gravitate around that you all agree to,
980
00:41:02.095 --> 00:41:04.085
which has also then got some legs
981
00:41:04.085 --> 00:41:06.085
that get you into some other things.
982
00:41:06.425 --> 00:41:09.565
So we invested in a platform which would allow us
983
00:41:09.565 --> 00:41:13.485
predominantly to be, uh, better at our proposal writing,
984
00:41:15.695 --> 00:41:17.195
but then we opened that up to the whole crowd
985
00:41:17.195 --> 00:41:18.475
and said, look, anyone can use this.
986
00:41:18.495 --> 00:41:21.355
If you are in systems engineering, in naval architecture,
987
00:41:21.355 --> 00:41:23.915
if you're in the the HR team, it doesn't matter.
988
00:41:23.975 --> 00:41:26.355
You, you can use this. And if you can find your own use
989
00:41:26.355 --> 00:41:29.595
cases, then that's just additional benefit on top
990
00:41:29.595 --> 00:41:31.595
of actually what we built the business case around.
991
00:41:32.735 --> 00:41:34.725
So I suppose there's the, you know, the opportunity
992
00:41:34.745 --> 00:41:37.205
to maximize value through democratization.
993
00:41:37.205 --> 00:41:38.885
So we talk to customers a lot around, you know,
994
00:41:38.885 --> 00:41:40.645
finding use cases that probably have
995
00:41:41.165 --> 00:41:42.285
parallel use cases that are similar.
996
00:41:42.305 --> 00:41:44.485
So that's obviously an example there that you've shared.
997
00:41:45.145 --> 00:41:48.125
Um, I think going beyond here, um,
998
00:41:48.265 --> 00:41:50.885
we haven't talked in too much detail around the,
999
00:41:51.065 --> 00:41:53.365
the specific solution that BMT have deployed.
1000
00:41:53.465 --> 00:41:54.485
So can you give us a little bit
1001
00:41:54.485 --> 00:41:56.285
of a view into perhaps the technologies
1002
00:41:56.345 --> 00:41:58.005
and then the business problem that it solves?
1003
00:41:58.705 --> 00:42:00.725
One of our major challenges we had was around
1004
00:42:01.275 --> 00:42:03.485
data security, intellectual property.
1005
00:42:03.505 --> 00:42:07.245
And so we actually started our solution around
1006
00:42:08.125 --> 00:42:10.645
bounding how we would overcome that problem,
1007
00:42:11.105 --> 00:42:13.925
but also recognizing that if we didn't plug the gap of
1008
00:42:14.625 --> 00:42:17.285
AI tools being available for our, our staff in a,
1009
00:42:17.285 --> 00:42:20.795
in a secure way, we expected there's gonna be lots
1010
00:42:20.795 --> 00:42:22.315
of people bringing their own AIs to work.
1011
00:42:22.415 --> 00:42:25.275
So our, our, our solution initially was to look at, well,
1012
00:42:25.275 --> 00:42:29.865
where, where could we build an application that can make use
1013
00:42:29.865 --> 00:42:32.505
of a large language model in a secure way?
1014
00:42:32.645 --> 00:42:35.385
And if we go back a couple of years, uh,
1015
00:42:35.865 --> 00:42:38.505
Microsoft had just announced the Azure OpenAI services were
1016
00:42:38.505 --> 00:42:40.465
going to be rolled out.
1017
00:42:41.205 --> 00:42:42.785
And we looked ahead
1018
00:42:42.785 --> 00:42:45.185
to when it was gonna be rolled out in our data center, um,
1019
00:42:45.185 --> 00:42:47.385
here in the UK that we, we base, um,
1020
00:42:47.545 --> 00:42:49.545
kinda our headquarters of the business around.
1021
00:42:50.405 --> 00:42:52.545
And we fixed on that as being, okay, as soon as
1022
00:42:52.545 --> 00:42:55.345
that's available, we want to start looking at creating a,
1023
00:42:55.525 --> 00:42:59.625
uh, a web application where we can secure the,
1024
00:42:59.685 --> 00:43:01.505
secure the backend, secure the endpoint,
1025
00:43:01.615 --> 00:43:04.265
make sure the infrastructures rugged and robust.
1026
00:43:04.265 --> 00:43:06.705
Because we were, we were confident that
1027
00:43:07.505 --> 00:43:09.665
actually the enterprise data protection standards
1028
00:43:09.665 --> 00:43:11.145
that Microsoft apply across its apps,
1029
00:43:11.325 --> 00:43:14.665
we could also apply within our environments
1030
00:43:15.125 --> 00:43:18.145
and make use of these services in a very, very secure way.
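(For readers who want the shape of this: calling Azure OpenAI through a resource provisioned in a UK region, so prompts and completions are processed in-country, looks roughly like the following with the current Python SDK (openai >= 1.x). The endpoint, deployment name and API version are placeholders, not BMT's configuration.)

```python
import os
from openai import AzureOpenAI

# The resource behind this endpoint would be provisioned in a UK region
# (e.g. UK South) so data stays in-country. Placeholder values throughout.
client = AzureOpenAI(
    azure_endpoint="https://example-uksouth.openai.azure.com/",
    api_key=os.environ["AZURE_OPENAI_API_KEY"],
    api_version="2024-02-01",
)

response = client.chat.completions.create(
    model="my-gpt-deployment",  # your deployment name, not the base model
    messages=[{"role": "user", "content": "Summarise this requirements document."}],
)
print(response.choices[0].message.content)
```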
1031
00:43:18.995 --> 00:43:20.695
Uh, so that was really the big first step with,
1032
00:43:20.695 --> 00:43:23.935
with Transparity, was how we would go about creating
1033
00:43:24.675 --> 00:43:27.255
what was actually a very, very simple chat application.
1034
00:43:27.325 --> 00:43:30.495
What we just wanted was effectively Microsoft Copilot
1035
00:43:30.495 --> 00:43:32.735
as it was, or Bing Chat as it was back then.
1036
00:43:33.235 --> 00:43:36.215
Uh, we just wanted that in a secure way where I wanted
1037
00:43:36.215 --> 00:43:37.695
to be able to upload documents
1038
00:43:38.075 --> 00:43:40.655
and ask questions around the documents and,
1039
00:43:40.795 --> 00:43:44.015
and information from the, the large language model itself
1040
00:43:44.885 --> 00:43:47.895
very quickly that then spawned out into the idea
1041
00:43:47.915 --> 00:43:51.135
of creating, um, templated assistants.
1042
00:43:51.315 --> 00:43:53.535
So actually, if I had somebody who was less comfortable
1043
00:43:53.535 --> 00:43:57.055
with writing really good prompts, how could we start to, um,
1044
00:43:57.245 --> 00:43:59.295
hard wire those things for people?
1045
00:43:59.555 --> 00:44:01.295
So actually if you were less comfortable,
1046
00:44:01.295 --> 00:44:04.575
you could come along, I need to use this template,
1047
00:44:04.725 --> 00:44:06.015
this is how I input into it.
1048
00:44:06.015 --> 00:44:07.415
and I get a response out of it.
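(A templated assistant is essentially hard-wired prompt engineering: the template owns the instructions and the user only fills in the blanks. A minimal sketch with hypothetical template names, not BMT's actual templates.)

```python
# Each template bakes in the prompt engineering; users supply only the inputs.
TEMPLATES = {
    "requirements_review": (
        "You are reviewing a set of customer requirements.\n"
        "List any ambiguous, untestable or conflicting requirements.\n"
        "Requirements:\n{requirements}"
    ),
    "proposal_summary": (
        "Summarise the following proposal section in three bullet points "
        "for a non-technical reviewer:\n{text}"
    ),
}

def build_prompt(template_name: str, **fields: str) -> str:
    """Fill a hard-wired template with the user's inputs."""
    return TEMPLATES[template_name].format(**fields)

prompt = build_prompt("requirements_review",
                      requirements="1. The system shall be fast.")
```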
1049
00:44:08.035 --> 00:44:12.015
So our, our journey really went from very basic chat very
1050
00:44:12.015 --> 00:44:13.295
quickly into this level,
1051
00:44:13.315 --> 00:44:14.935
and it's continued to evolve since then.
1052
00:44:14.935 --> 00:44:17.655
We've looked at how do we bring in new, new, uh,
1053
00:44:17.835 --> 00:44:19.295
new data sources, how do we,
1054
00:44:19.595 --> 00:44:21.495
how do we harden the infrastructure further?
1055
00:44:21.955 --> 00:44:23.775
How do we get that globally deployable?
1056
00:44:23.775 --> 00:44:25.735
Because in our business, um, every,
1057
00:44:26.185 --> 00:44:27.375
we're in multiple countries
1058
00:44:27.375 --> 00:44:29.175
and every country has its own data
1059
00:44:29.175 --> 00:44:30.815
protection sovereignty laws.
1060
00:44:31.265 --> 00:44:33.925
And so we have to have a solution which actually we're able
1061
00:44:33.925 --> 00:44:36.405
to deploy in and coordinate in that sort of way,
1062
00:44:36.865 --> 00:44:39.845
whilst also, you know, leveraging some of the great work
1063
00:44:39.845 --> 00:44:41.485
that happens all around the world.
1064
00:44:41.505 --> 00:44:44.445
And so we wanna be able to share these great templates
1065
00:44:44.445 --> 00:44:45.605
for assistants with other people
1066
00:44:45.705 --> 00:44:48.805
and, um, kind of leverage the collective crowd
1067
00:44:48.805 --> 00:44:51.605
around the world, not just, not just have it fixated on,
1068
00:44:52.025 --> 00:44:54.165
you know, one country or two countries in the group.
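(One way to picture the sovereignty constraint: a routing table mapping each operating country to a regional deployment, so requests are only ever served from infrastructure that satisfies local data-protection rules. Countries, regions and the hostname pattern below are illustrative assumptions, not BMT's setup.)

```python
# Map each operating country to the region hosting a compliant deployment.
REGION_FOR_COUNTRY = {
    "GB": "uksouth",
    "AU": "australiaeast",
    "CA": "canadacentral",
    "NL": "westeurope",
}

def endpoint_for(country_code: str) -> str:
    """Return the in-region endpoint for a user's country, or fail closed."""
    region = REGION_FOR_COUNTRY.get(country_code)
    if region is None:
        raise ValueError(f"No compliant deployment for country {country_code!r}")
    return f"https://assistant-{region}.example.com/"  # hypothetical hostname
```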
1069
00:44:55.265 --> 00:44:57.965
And of all that work that you've done, I think, uh,
1070
00:44:58.275 --> 00:45:01.525
listeners tend to to love, um, some real world stories.
1071
00:45:01.885 --> 00:45:04.205
I think, uh, generally always ask everybody.
1072
00:45:04.705 --> 00:45:07.765
Any particular comments that you've had from end users
1073
00:45:07.765 --> 00:45:09.765
that have used the tool that have really stuck with you
1074
00:45:09.865 --> 00:45:11.205
or things that you're quite proud of
1075
00:45:11.205 --> 00:45:12.365
around the, the deployment?
1076
00:45:15.305 --> 00:45:18.785
I think probably the best day has been when we were
1077
00:45:20.485 --> 00:45:22.675
presenting, uh, to one
1078
00:45:22.675 --> 00:45:24.915
of our main customers in the, in the Ministry of Defence.
1079
00:45:24.935 --> 00:45:27.635
So kind of very senior, very senior civil servant.
1080
00:45:28.335 --> 00:45:31.555
And he was really interested in, um, in innovation
1081
00:45:31.735 --> 00:45:34.995
and kind of particularly with engineering, um, services.
1082
00:45:35.135 --> 00:45:37.075
And we were, we were talking about this particular,
1083
00:45:37.075 --> 00:45:39.395
particular project about what, what we'd, what we'd done.
1084
00:45:40.265 --> 00:45:42.245
And I remember when he said, um,
1085
00:45:43.915 --> 00:45:45.335
but this is just a, a mock
1086
00:45:45.335 --> 00:45:46.415
up you were showing me on the screen.
1087
00:45:46.615 --> 00:45:49.295
'cause it was a, it was in a PowerPoint presentation. I said,
1088
00:45:49.295 --> 00:45:51.215
well, it is, but do you wanna see the real thing?
1089
00:45:51.755 --> 00:45:53.975
And we just flick it up on, on, on screen.
1090
00:45:53.975 --> 00:45:56.215
And this was, this was about 18 months or so ago,
1091
00:45:56.875 --> 00:45:58.575
and, uh, we started running through some
1092
00:45:58.575 --> 00:46:01.655
of the use cases we particularly had about how we would, um,
1093
00:46:02.235 --> 00:46:04.495
uh, assess a new set of customer requirements,
1094
00:46:04.495 --> 00:46:06.695
how we'd break, break it apart, make sure
1095
00:46:06.695 --> 00:46:09.935
that we are answering every single question, um, look at
1096
00:46:10.275 --> 00:46:12.055
how we would improve what they'd given us.
1097
00:46:12.075 --> 00:46:14.655
And actually, what he was really taken by was the fact
1098
00:46:14.655 --> 00:46:16.775
that it was very practical what we were doing.
1099
00:46:17.235 --> 00:46:20.055
It mirrored what they would do on the other side
1100
00:46:20.075 --> 00:46:22.415
of the fence, as it were, but also they could see
1101
00:46:22.415 --> 00:46:26.015
that we hadn't completely given up our, our souls.
1102
00:46:26.015 --> 00:46:27.455
We hadn't taken all the humans outta the loop.
1103
00:46:27.475 --> 00:46:29.215
You know, there were still people out there who were then,
1104
00:46:29.355 --> 00:46:31.135
oh yeah, okay, I need to action now, right?
1105
00:46:31.135 --> 00:46:33.215
I'll go and change it and I'll go and run it again.
1106
00:46:33.555 --> 00:46:36.855
But to be able to show a, you know, a senior customer, um,
1107
00:46:36.875 --> 00:46:38.735
who was absolutely blown away that you,
1108
00:46:38.735 --> 00:46:41.495
this 40-year-old kind of engineering company that, that he,
1109
00:46:41.515 --> 00:46:42.695
he, he really relied on
1110
00:46:42.695 --> 00:46:44.455
for engineering services actually was kind
1111
00:46:44.455 --> 00:46:47.055
of at the leading edge of, of, of these sorts of tools and,
1112
00:46:47.075 --> 00:46:49.135
and, and it being part of our, part
1113
00:46:49.135 --> 00:46:50.775
of our delivery. Um, back to him.
1114
00:46:51.315 --> 00:46:54.575
No, it's amazing to hear. So I suppose closing out, um,
1115
00:46:55.195 --> 00:46:57.455
what's the one piece of advice that you'd give
1116
00:46:57.455 --> 00:46:59.335
to somebody starting their AI journey? I think
1117
00:46:59.335 --> 00:47:01.495
There's two, I think there's two, there's two tips.
1118
00:47:01.735 --> 00:47:05.855
I think start off with what's unacceptable to you.
1119
00:47:06.435 --> 00:47:09.895
Um, if there's particular red lines around, um,
1120
00:47:11.575 --> 00:47:12.765
where you're gonna hold data,
1121
00:47:12.765 --> 00:47:14.845
where you're not gonna hold data, how open you're gonna be
1122
00:47:14.905 --> 00:47:18.565
to other tools that are out there, um, that's really,
1123
00:47:18.565 --> 00:47:21.285
really key because it, it sets kind of the, the,
1124
00:47:21.425 --> 00:47:24.085
the whole cascade of policies
1125
00:47:24.225 --> 00:47:28.685
and procedures around, um, am I gonna allow everyone to come
1126
00:47:28.685 --> 00:47:29.965
and use any tool that they want?
1127
00:47:30.065 --> 00:47:31.765
Or do I need to provide them a solution
1128
00:47:31.765 --> 00:47:34.365
that does absolutely everything? Once you've started to
1129
00:47:34.975 --> 00:47:39.635
tackle the tech stack problem, um, it is all about then, um,
1130
00:47:40.345 --> 00:47:41.955
just trying, trying, trying.
1131
00:47:41.955 --> 00:47:44.395
Yeah, you just gotta, you gotta, you gotta keep on, you,
1132
00:47:44.395 --> 00:47:46.555
keep on testing out small use cases,
1133
00:47:46.745 --> 00:47:49.355
getting more people involved, give them a bit of,
1134
00:47:49.425 --> 00:47:51.315
give them the tools, give them the opportunity,
1135
00:47:51.425 --> 00:47:53.875
give 'em the space to, to try things out,
1136
00:47:54.095 --> 00:47:56.515
and they will come up with more ideas than any sort
1137
00:47:56.515 --> 00:47:59.555
of top down, um, community
1138
00:47:59.555 --> 00:48:02.075
of clever people will ever, will ever be able to do.
1139
00:48:02.495 --> 00:48:03.595
And really, they're the sort of,
1140
00:48:03.705 --> 00:48:05.275
they're the source of insight.
1141
00:48:05.375 --> 00:48:06.875
And I think capturing that
1142
00:48:07.615 --> 00:48:09.995
and, you know, bottling that, finding out
1143
00:48:09.995 --> 00:48:11.795
where there's things that we need to put in place
1144
00:48:11.815 --> 00:48:13.955
to maybe improve the core application,
1145
00:48:14.815 --> 00:48:18.595
change a business policy, provide more data on something,
1146
00:48:18.615 --> 00:48:19.715
oh, I need more training.
1147
00:48:19.775 --> 00:48:21.035
It, it becomes then about, you know,
1148
00:48:21.035 --> 00:48:22.515
how do you enable the crowd of people?
1149
00:48:22.515 --> 00:48:26.755
Because ultimately, you know, AI adoption over the next,
1150
00:48:26.775 --> 00:48:29.075
you know, year or two, particularly as you start
1151
00:48:29.075 --> 00:48:31.075
to move into human agent teaming,
1152
00:48:31.175 --> 00:48:33.355
and what does the future of that look like as well?
1153
00:48:33.865 --> 00:48:37.195
It's all about, you know, empowering the big group of people
1154
00:48:37.465 --> 00:48:38.795
with how they're gonna use the tools
1155
00:48:38.935 --> 00:48:40.875
and not really so much about the technology.
1156
00:48:40.875 --> 00:48:42.955
The technology's gonna keep on advancing
1157
00:48:42.955 --> 00:48:44.915
whether or not, you know, we are using it or not,
1158
00:48:45.335 --> 00:48:47.035
but it's all about, you know, how you're gonna get those,
1159
00:48:47.035 --> 00:48:48.395
those guys to do more with it.
1160
00:48:48.885 --> 00:48:51.595
Simon, thanks so much for your time and insights today
1161
00:48:51.735 --> 00:48:53.115
and your partnership as always.
1162
00:48:53.465 --> 00:48:57.155
Yeah, thank you, Jodie.
1163
00:48:57.155 --> 00:48:58.555
Welcome to the Transform Podcast
1164
00:48:58.935 --> 00:49:00.195
and thank you for joining me today.
1165
00:49:00.315 --> 00:49:01.555
I obviously know you very well,
1166
00:49:02.055 --> 00:49:04.395
but for those that don't, what exactly is
1167
00:49:04.435 --> 00:49:05.555
a Chief AI officer?
1168
00:49:05.905 --> 00:49:07.075
What do you do at Transparity?
1169
00:49:07.435 --> 00:49:09.355
I think it varies, right? So I think if I was customer
1170
00:49:09.355 --> 00:49:11.075
facing, it would probably be slightly different to
1171
00:49:11.075 --> 00:49:14.075
what I do at Transparity, but for me, in my role here,
1172
00:49:14.465 --> 00:49:17.435
it's really around making sure we're actually walking the
1173
00:49:17.435 --> 00:49:18.795
walk, not just talking the talk.
1174
00:49:18.895 --> 00:49:21.075
So looking at our internal AI policy,
1175
00:49:21.455 --> 00:49:23.715
making sure we're actually leveraging AI internally
1176
00:49:23.775 --> 00:49:25.435
to become more efficient and more innovative,
1177
00:49:25.775 --> 00:49:27.155
but then also helping our clients
1178
00:49:27.155 --> 00:49:30.155
with their digital transformation needs and adopting AI
1179
00:49:30.155 --> 00:49:33.075
and AI strategy, but also, I guess more importantly
1180
00:49:33.075 --> 00:49:35.275
as a partner, interfacing into Microsoft
1181
00:49:35.375 --> 00:49:37.675
and, um, building our kind of relationship
1182
00:49:37.675 --> 00:49:40.475
and market presence as a Microsoft Solutions partner.
1183
00:49:40.825 --> 00:49:42.995
Awesome. So we're gonna do a little bit
1184
00:49:42.995 --> 00:49:45.035
of myth busting later in our section today.
1185
00:49:45.345 --> 00:49:47.555
Nice to try and help customers out on their journey.
1186
00:49:48.215 --> 00:49:49.635
Um, but I suppose more broadly,
1187
00:49:49.945 --> 00:49:51.995
what are you hearing from organizations at the moment
1188
00:49:51.995 --> 00:49:53.275
around AI adoption?
1189
00:49:53.275 --> 00:49:55.195
Where are people going? What's most interesting?
1190
00:49:55.335 --> 00:49:56.355
You know, what's getting you
1191
00:49:56.355 --> 00:49:57.635
excited in the industry right now?
1192
00:49:58.035 --> 00:50:00.075
I think it's good that it's very much top of mind,
1193
00:50:00.095 --> 00:50:01.235
so it is keeping me busy.
1194
00:50:01.775 --> 00:50:06.555
Um, but I think it's really varied depending on
1195
00:50:06.555 --> 00:50:08.315
where clients are in their kind
1196
00:50:08.315 --> 00:50:10.075
of digital transformation journey, right?
1197
00:50:10.595 --> 00:50:13.435
Customers who have been leveraging AI for decades,
1198
00:50:13.585 --> 00:50:15.075
like machine learning and data science
1199
00:50:15.615 --> 00:50:20.075
are obviously a lot better prepared for generative AI.
1200
00:50:20.415 --> 00:50:22.715
Um, they've already got that kind of culture instilled,
1201
00:50:22.785 --> 00:50:24.755
they already have those operating models in place.
1202
00:50:25.095 --> 00:50:26.435
So really it's just, okay,
1203
00:50:26.435 --> 00:50:28.445
how do we also then leverage a different type
1204
00:50:28.445 --> 00:50:29.845
of AI, a new technique?
1205
00:50:30.265 --> 00:50:32.005
Um, whereas I think organizations
1206
00:50:32.005 --> 00:50:35.685
who are just getting started, there's a mixture of
1207
00:50:36.965 --> 00:50:39.365
analysis paralysis of, oh, everything needs
1208
00:50:39.365 --> 00:50:40.605
to be perfect before we get going.
1209
00:50:41.285 --> 00:50:42.805
I think, you know, a lot
1210
00:50:42.805 --> 00:50:45.005
of organizations are doing a great job of marketing,
1211
00:50:45.145 --> 00:50:47.645
so there's a lot of kind of, you know,
1212
00:50:47.645 --> 00:50:49.805
scaremongering in terms of, you know,
1213
00:50:49.825 --> 00:50:50.845
you should be doing this.
1214
00:50:50.945 --> 00:50:54.525
And so again, there's that feeling within the market.
1215
00:50:54.545 --> 00:50:56.565
And then I think you've got companies that are in the middle
1216
00:50:57.065 --> 00:51:01.325
who are experimenting, learning, iterating,
1217
00:51:02.125 --> 00:51:03.605
starting to derive some business value.
1218
00:51:03.625 --> 00:51:06.565
And that feels like a good spot to be really.
1219
00:51:07.745 --> 00:51:10.405
So I suppose you'll have seen mixed uh,
1220
00:51:10.405 --> 00:51:12.325
experiences from our client base so far.
1221
00:51:12.395 --> 00:51:15.285
Yeah. Um, I suppose, you know, let's talk some,
1222
00:51:15.285 --> 00:51:16.605
maybe some, some horror stories.
1223
00:51:16.615 --> 00:51:18.245
Let's talk perhaps some successes.
1224
00:51:18.245 --> 00:51:20.685
Where are you seeing customers go right and go wrong
1225
00:51:20.865 --> 00:51:22.765
and, you know, what are your initial tips for people
1226
00:51:22.765 --> 00:51:24.285
that might be listening in terms of
1227
00:51:24.285 --> 00:51:26.125
how they can start the journey on the right footing?
1228
00:51:26.435 --> 00:51:29.685
Yeah, I think clients who are kind of doing well,
1229
00:51:29.785 --> 00:51:32.925
and we've spoken to some today, um, I think it's
1230
00:51:33.665 --> 00:51:35.645
not just thinking about AI as technology.
1231
00:51:36.085 --> 00:51:38.765
I think there's a real common misbelief within the market
1232
00:51:38.765 --> 00:51:41.525
that, you know, AI is just about tech.
1233
00:51:41.915 --> 00:51:44.205
Well, it's, it's, it's change management,
1234
00:51:44.275 --> 00:51:45.405
it's people, it's process.
1235
00:51:45.985 --> 00:51:47.845
And really it's, I think the companies
1236
00:51:47.845 --> 00:51:50.805
who are having success with AI are really thinking about,
1237
00:51:51.185 --> 00:51:52.325
you know, what's our vision?
1238
00:51:52.775 --> 00:51:54.925
Where are we now? Where do we want to be?
1239
00:51:55.305 --> 00:51:59.005
And how can AI enable that? You know, it's not siloed.
1240
00:51:59.195 --> 00:52:01.605
It's very much core to what they're trying
1241
00:52:01.605 --> 00:52:02.765
to achieve as an organization.
1242
00:52:02.865 --> 00:52:03.925
So really thinking about
1243
00:52:04.345 --> 00:52:06.845
how can we derive business value from AI?
1244
00:52:07.075 --> 00:52:09.085
What are those use cases to get started with?
1245
00:52:09.505 --> 00:52:12.205
And kind of building out their AI strategy around that.
1246
00:52:12.865 --> 00:52:17.505
Um, I think companies who are not making
1247
00:52:17.525 --> 00:52:20.745
as much progress, I think is probably very much
1248
00:52:20.745 --> 00:52:22.185
that analysis paralysis.
1249
00:52:22.185 --> 00:52:24.385
And I think we've had a quite a lot of conversations
1250
00:52:24.385 --> 00:52:26.465
around this in terms of, you know,
1251
00:52:26.815 --> 00:52:28.385
wanting everything to be perfect.
1252
00:52:29.245 --> 00:52:31.785
Um, whereas we see that really to have traction, you need
1253
00:52:31.785 --> 00:52:34.225
to kind of get some of this stuff going in parallel
1254
00:52:34.225 --> 00:52:37.705
and actually AI's a great hook to get your data in order,
1255
00:52:38.045 --> 00:52:40.745
you know, harden those landing zones and all that goodness.
1256
00:52:41.325 --> 00:52:43.985
Um, so I think, yeah, analysis paralysis
1257
00:52:43.985 --> 00:52:46.705
and also, I mean, we have some conversations with clients
1258
00:52:46.725 --> 00:52:49.065
who want to see experience
1259
00:52:49.065 --> 00:52:51.865
or examples of where have you done this for x
1260
00:52:51.865 --> 00:52:55.105
that looks exactly like our business, where a lot
1261
00:52:55.105 --> 00:52:57.185
of the time it kind of doesn't exist still.
1262
00:52:57.215 --> 00:53:00.065
It's still early days in the gen AI market really.
1263
00:53:00.455 --> 00:53:01.785
Yeah. That's, uh, that sort
1264
00:53:01.785 --> 00:53:03.825
of first mover advantage, last mover risk.
1265
00:53:03.825 --> 00:53:07.185
Right. Uh, impossible. Impossible outcome. Exactly.
1266
00:53:07.605 --> 00:53:09.985
Um, I suppose, what's the thing right now
1267
00:53:10.005 --> 00:53:12.385
that's getting you most excited in the AI industry?
1268
00:53:12.805 --> 00:53:15.145
Um, you know, anything that, uh,
1269
00:53:15.245 --> 00:53:17.425
you think is gonna make a significant impact this year?
1270
00:53:17.725 --> 00:53:19.505
Or perhaps just your insights on
1271
00:53:19.505 --> 00:53:21.185
where you think the industry might be in 12 months time?
1272
00:53:22.045 --> 00:53:23.425
Uh, I don't, I didn't think we'd get
1273
00:53:23.425 --> 00:53:25.900
through this podcast without talking about AI agents.
1274
00:53:26.385 --> 00:53:28.485
Uh, but yeah, I think, uh, we hear the,
1275
00:53:28.585 --> 00:53:29.685
and it is a buzzword, right?
1276
00:53:29.685 --> 00:53:31.285
We are hearing the buzzword around agents,
1277
00:53:31.285 --> 00:53:33.045
but I think really being able
1278
00:53:33.045 --> 00:53:35.085
to reimagine business processes
1279
00:53:35.505 --> 00:53:38.485
and having, you know, AI that works on behalf of us
1280
00:53:38.945 --> 00:53:40.565
and kind of automating, like it's,
1281
00:53:40.565 --> 00:53:42.605
it's RPA reimagined really, isn't it?
1282
00:53:42.985 --> 00:53:45.565
Um, but I think, yeah, agents is really exciting.
1283
00:53:45.625 --> 00:53:47.845
And I specifically think in
1284
00:53:48.485 --> 00:53:52.885
scenarios like IT help desks, call centers, um,
1285
00:53:53.275 --> 00:53:56.285
yeah, there's so much in terms of not just help me tap into
1286
00:53:56.285 --> 00:53:57.645
that kind of knowledge repository
1287
00:53:57.705 --> 00:54:00.485
so you can surface the right answer at the right time.
1288
00:54:01.025 --> 00:54:03.245
But also if you think about just making sure
1289
00:54:03.245 --> 00:54:05.405
that everyone in that contact center's providing a
1290
00:54:05.405 --> 00:54:09.005
consistent experience, helping to onboard, ensuring quality kind
1291
00:54:09.005 --> 00:54:10.565
of assurance across the board.
1292
00:54:10.955 --> 00:54:14.245
There's so many use cases from a contact center perspective
1293
00:54:14.245 --> 00:54:16.885
that if you unlock one, you kind
1294
00:54:16.885 --> 00:54:18.005
of keep seeing the value, right?
1295
00:54:18.005 --> 00:54:19.925
Because yeah, they might go up in complexity,
1296
00:54:20.025 --> 00:54:23.645
but once you get started, options are kind of endless.
1297
00:54:24.115 --> 00:54:26.445
Yeah. So Simon, who we had on earlier, was talking about,
1298
00:54:26.545 --> 00:54:29.325
um, you know, choosing use cases that are, you know,
1299
00:54:29.325 --> 00:54:30.765
easily replicated. Yes.
1300
00:54:30.765 --> 00:54:32.445
across the organization. That's, uh, you know,
1301
00:54:32.445 --> 00:54:33.565
a key takeaway that we had.
1302
00:54:34.065 --> 00:54:36.245
Um, I suppose going back to the point more broadly around,
1303
00:54:36.585 --> 00:54:37.965
um, you know, organizations
1304
00:54:38.325 --> 00:54:39.405
adopting this sort of technology.
1305
00:54:39.405 --> 00:54:42.565
Mm-hmm. What changes do you think an organization needs
1306
00:54:42.715 --> 00:54:45.925
to make, um, to adopt AI successfully?
1307
00:54:45.925 --> 00:54:47.005
And I suppose more importantly,
1308
00:54:47.145 --> 00:54:48.525
why should they be excited about it?
1309
00:54:48.525 --> 00:54:49.765
And I know that's gonna differ from
1310
00:54:49.765 --> 00:54:50.965
an employee perspective Yeah.
1311
00:54:50.965 --> 00:54:52.045
Versus a leadership perspective.
1312
00:54:52.625 --> 00:54:55.805
Um, you know, what are the reasons why say a leader may want
1313
00:54:55.805 --> 00:54:58.325
to adopt AI and understanding that the reasons
1314
00:54:58.325 --> 00:55:00.645
that an employee may adopt them are very different,
1315
00:55:00.645 --> 00:55:01.765
and how do they bridge that gap?
1316
00:55:02.035 --> 00:55:03.645
Yeah. Okay. We're gonna have to break this down
1317
00:55:03.845 --> 00:55:05.325
'cause I've forgotten some of what you've said already.
1318
00:55:05.885 --> 00:55:09.045
But I think why you should be excited around ai.
1319
00:55:09.285 --> 00:55:12.325
I think there's no ignoring it. It's not going anywhere.
1320
00:55:12.705 --> 00:55:14.165
And I think what's a famous quote,
1321
00:55:15.035 --> 00:55:17.165
AI's not gonna replace you, but someone that learns
1322
00:55:17.165 --> 00:55:20.005
and knows how to use AI is potentially gonna replace you.
1323
00:55:20.065 --> 00:55:23.205
So I think really it's building out your AI literacy
1324
00:55:23.305 --> 00:55:25.925
to understand what AI is capable of,
1325
00:55:26.275 --> 00:55:27.565
what it isn't capable of,
1326
00:55:27.785 --> 00:55:29.845
and actually how you can delegate effectively
1327
00:55:29.845 --> 00:55:31.285
to AI is something
1328
00:55:31.285 --> 00:55:33.405
that everyone should be investing the time into learning.
1329
00:55:34.285 --> 00:55:35.565
I think, you know,
1330
00:55:36.225 --> 00:55:39.645
it adds value in various ways across an organization.
1331
00:55:39.745 --> 00:55:42.685
So actually even just taking our personal day-to-day lives,
1332
00:55:42.785 --> 00:55:46.605
if we think about now how we're using tools like ChatGPT,
1333
00:55:47.185 --> 00:55:51.005
you know, Gemini, other services are available, um, I think,
1334
00:55:51.145 --> 00:55:53.565
you know, I'm visiting a new location,
1335
00:55:54.075 --> 00:55:55.605
help me define my agenda,
1336
00:55:55.825 --> 00:55:57.885
or I've got these ingredients left in the
1337
00:55:57.885 --> 00:55:59.165
fridge, what can I make?
1338
00:55:59.285 --> 00:56:03.045
I think AI's now become a bit of a commodity, right?
1339
00:56:03.045 --> 00:56:05.685
In terms of we're all using it as consumers,
1340
00:56:06.305 --> 00:56:09.085
so then we're expecting to be able to use it at work.
1341
00:56:09.625 --> 00:56:12.645
But as a customer, we're also expecting to be able to
1342
00:56:13.525 --> 00:56:15.485
reap the rewards of it through customer service.
1343
00:56:16.225 --> 00:56:20.325
So I think we always talk about kind of the four main areas
1344
00:56:20.425 --> 00:56:23.045
of transformation within an organization for AI.
1345
00:56:23.140 --> 00:56:25.125
And it really is that employee productivity
1346
00:56:25.585 --> 00:56:28.645
to just helping make my day-to-day working life easier,
1347
00:56:29.235 --> 00:56:32.045
take away those mundane, repeatable tasks
1348
00:56:32.045 --> 00:56:33.925
that no one enjoys, um,
1349
00:56:33.945 --> 00:56:36.845
and also be able to surface information more quickly,
1350
00:56:37.305 --> 00:56:39.005
create a first draft more quickly,
1351
00:56:39.835 --> 00:56:41.725
then there's customer experience
1352
00:56:42.065 --> 00:56:44.605
and just being able to provide, you know,
1353
00:56:44.605 --> 00:56:46.445
the best customer service possible.
1354
00:56:47.105 --> 00:56:48.525
Um, then it's all around
1355
00:56:48.525 --> 00:56:51.645
that re-imagining making business processes more efficient.
1356
00:56:51.915 --> 00:56:53.565
Like, just because something's existed
1357
00:56:53.665 --> 00:56:56.885
for 10 years doesn't mean that's the right way to do it now.
1358
00:56:57.065 --> 00:56:59.925
And if you could take away all of that red tape, you know,
1359
00:57:00.075 --> 00:57:01.245
what could that look like?
1360
00:57:01.785 --> 00:57:05.285
And then finally, I think leveraging AI for innovation, R&D,
1361
00:57:05.285 --> 00:57:08.405
um, being able to take things to market more quickly,
1362
00:57:08.815 --> 00:57:11.005
empowering software engineers to, you know,
1363
00:57:11.005 --> 00:57:13.525
write code more quickly and efficiently.
1364
00:57:13.645 --> 00:57:15.925
I think the options are really endless in terms
1365
00:57:15.925 --> 00:57:17.525
of why you should be excited about it.
1366
00:57:17.525 --> 00:57:19.045
And I think it's gonna be kind
1367
00:57:19.045 --> 00:57:21.805
of market defining for every industry.
1368
00:57:22.325 --> 00:57:24.085
Absolutely. So stepping back from that for a sec,
1369
00:57:24.245 --> 00:57:25.685
'cause that's, uh, that's a lot of stuff.
1370
00:57:26.465 --> 00:57:28.885
Um, and I suppose for a lot of people listening, it's,
1371
00:57:28.885 --> 00:57:30.485
it could be quite overwhelming. Yeah.
1372
00:57:30.485 --> 00:57:33.165
To get started. So how should people go about
1373
00:57:33.315 --> 00:57:34.965
isolating good use cases?
1374
00:57:35.155 --> 00:57:37.485
When we've spoken to some of the guests earlier today,
1375
00:57:37.825 --> 00:57:39.765
you know, they've talked about some use cases being
1376
00:57:39.765 --> 00:57:41.005
obvious to their organization.
1377
00:57:41.005 --> 00:57:45.365
Yeah. Um, but also, um, that, that perhaps sometimes the,
1378
00:57:45.365 --> 00:57:47.525
the leadership lens on what's valuable versus
1379
00:57:47.585 --> 00:57:48.805
the user lens is very different.
1380
00:57:48.805 --> 00:57:50.485
Yeah. A hundred percent. So how can you arrive at things
1381
00:57:50.485 --> 00:57:51.965
that really everybody is happy about
1382
00:57:51.985 --> 00:57:53.765
and are gonna get behind most importantly?
1383
00:57:54.035 --> 00:57:55.085
Yeah, it's so true.
1384
00:57:55.145 --> 00:57:58.125
And I, I, I think AI's really overwhelming
1385
00:57:58.305 --> 00:58:01.005
and we always say like, start with the business outcomes.
1386
00:58:01.075 --> 00:58:02.245
What are you looking to achieve?
1387
00:58:02.265 --> 00:58:04.525
And then think about what the right solution is, right?
1388
00:58:04.525 --> 00:58:07.045
Tech comes after absolutely. What you're trying to achieve.
1389
00:58:07.345 --> 00:58:09.285
And one of the things we run here at Transparency,
1390
00:58:09.285 --> 00:58:11.565
which are some of my favorite days at work,
1391
00:58:11.985 --> 00:58:14.925
are supporting clients with design-led thinking
1392
00:58:14.925 --> 00:58:15.965
and visioning workshops
1393
00:58:16.185 --> 00:58:19.405
to help them look across the various departments within
1394
00:58:19.405 --> 00:58:22.245
their organization or their kind of end-to-end supply chain
1395
00:58:22.745 --> 00:58:25.205
and really pick out what those use cases are.
1396
00:58:25.785 --> 00:58:27.685
And I think the first thing
1397
00:58:27.685 --> 00:58:31.685
that highlights is the importance of cross-functional teams
1398
00:58:32.385 --> 00:58:35.725
and bringing people that understand where your data sits,
1399
00:58:35.725 --> 00:58:38.845
understands your tech, but also, you know, the SMEs
1400
00:58:38.845 --> 00:58:41.605
that are really living these day-to-day
1401
00:58:42.475 --> 00:58:44.365
and weekly workflows.
1402
00:58:45.545 --> 00:58:47.565
Um, and I think, so yeah, having those SMEs
1403
00:58:47.565 --> 00:58:51.165
that really understand the business processes paired
1404
00:58:51.165 --> 00:58:53.165
with the, you know, your technical teams
1405
00:58:53.265 --> 00:58:57.065
to know what's feasible, typically that will help you
1406
00:58:57.065 --> 00:59:00.265
to identify use cases that are gonna add business value,
1407
00:59:00.735 --> 00:59:02.185
that are also feasible,
1408
00:59:02.645 --> 00:59:05.865
but then will also help you build that roadmap of, okay,
1409
00:59:05.885 --> 00:59:07.345
here's some low-hanging fruit,
1410
00:59:07.485 --> 00:59:11.305
an easy use case to get started with, build some momentum
1411
00:59:11.815 --> 00:59:13.945
that then will take us into this kind of longer
1412
00:59:14.635 --> 00:59:16.185
three horizon roadmap.
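(The sifting Jodie describes can be made concrete with a simple scoring pass: rate each workshop idea for business value and feasibility, rank by the product, and treat the high-feasibility items as the low-hanging fruit. The use cases and scores below are invented for illustration.)

```python
# Workshop output: each candidate idea scored 1-10 on value and feasibility.
use_cases = [
    {"name": "bid summarisation",     "value": 8, "feasibility": 9},
    {"name": "contract risk triage",  "value": 9, "feasibility": 4},
    {"name": "meeting minute drafts", "value": 5, "feasibility": 9},
]

# Rank on the value x feasibility product, then pull out the quick wins.
ranked = sorted(use_cases, key=lambda u: u["value"] * u["feasibility"], reverse=True)
quick_wins = [u["name"] for u in ranked if u["feasibility"] >= 7]
print(quick_wins)  # ['bid summarisation', 'meeting minute drafts']
```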
1413
00:59:16.815 --> 00:59:17.825
It's interesting, we've run some
1414
00:59:17.825 --> 00:59:20.185
of those workshops together and I find them, like you,
1415
00:59:20.185 --> 00:59:21.245
really interesting.
1416
00:59:21.505 --> 00:59:23.845
Mostly because the most interesting use cases that tend
1417
00:59:23.845 --> 00:59:25.445
to come out are things that you would never even
1418
00:59:25.445 --> 00:59:26.525
thought of at the start of the day.
1419
00:59:26.525 --> 00:59:28.125
Yeah. And in, and in many instances things
1420
00:59:28.125 --> 00:59:31.245
that perhaps people in leadership even within the room don't
1421
00:59:31.245 --> 00:59:32.885
even know exist within their own organization.
1422
00:59:33.035 --> 00:59:35.005
Exactly. I think you get a lot of aha moments.
1423
00:59:35.005 --> 00:59:36.965
Yeah, right. In terms of, you've got those exec sponsors
1424
00:59:36.965 --> 00:59:38.725
there who are like, you do that? Yeah.
1425
00:59:38.745 --> 00:59:41.925
For two hours every week. Like, why are we not fixing that?
1426
00:59:42.105 --> 00:59:43.245
And so I think it does kind
1427
00:59:43.245 --> 00:59:46.205
of help make sure there is a level of realism
1428
00:59:46.265 --> 00:59:48.325
and reality to the AI strategy.
1429
00:59:48.645 --> 00:59:51.645
'cause you do want, you know, that overall purpose of
1430
00:59:51.645 --> 00:59:54.245
what you're working towards, but you also wanna kind
1431
00:59:54.245 --> 00:59:56.405
of build that momentum, bring everyone on the journey.
1432
00:59:57.265 --> 01:00:00.125
Um, and I think those envisioning workshops are a great
1433
01:00:00.125 --> 01:00:01.405
way of doing that.
1434
01:00:02.305 --> 01:00:05.485
So, moving a bit beyond those initial workshops, um,
1435
01:00:05.985 --> 01:00:07.325
how have you found, you know,
1436
01:00:07.325 --> 01:00:08.445
obviously working at Transparity
1437
01:00:08.445 --> 01:00:10.925
and working with clients to deliver these solutions, um,
1438
01:00:10.945 --> 01:00:13.285
you know, how are we helping clients to sort
1439
01:00:13.285 --> 01:00:15.045
of rapidly bring these solutions to market?
1440
01:00:15.045 --> 01:00:16.005
You know what, let's talk a little
1441
01:00:16.005 --> 01:00:17.165
bit, I suppose, about our approach.
1442
01:00:17.835 --> 01:00:20.805
Yeah, um, I do think it's changing.
1443
01:00:20.865 --> 01:00:22.565
So maybe we'll talk about this from two lenses.
1444
01:00:22.785 --> 01:00:26.965
I'd say for the last two years, you know,
1445
01:00:26.965 --> 01:00:28.565
organizations have been really looking
1446
01:00:28.625 --> 01:00:30.285
to prove the value of AI.
1447
01:00:30.425 --> 01:00:32.005
So I think they've started to get used to it.
1448
01:00:32.005 --> 01:00:34.845
They started with just trying one
1449
01:00:34.845 --> 01:00:38.725
or two use cases, proving the value of that to almost get
1450
01:00:38.725 --> 01:00:41.805
that kind of buy-in and budget, to be able
1451
01:00:41.805 --> 01:00:44.565
to move into deployment, track the returns on
1452
01:00:44.565 --> 01:00:47.565
that, measure, monitor and kind of iterate.
1453
01:00:47.595 --> 01:00:51.085
Yeah. So I think we've seen, um, yeah, clients going from
1454
01:00:51.085 --> 01:00:53.285
that envisioning workshop, selecting
1455
01:00:53.345 --> 01:00:55.565
or prioritizing the use cases from that,
1456
01:00:56.035 --> 01:00:59.805
then moving into kind of scoping, so really designing, okay,
1457
01:00:59.815 --> 01:01:03.405
we've landed on this idea, what is the right solution
1458
01:01:03.405 --> 01:01:05.165
for us now, you know, is that low code?
1459
01:01:05.425 --> 01:01:08.405
Is that pro code? Is it buy, is it build?
1460
01:01:08.405 --> 01:01:11.085
Because everything has AI infused into it now.
1461
01:01:11.145 --> 01:01:13.445
So a lot of the products you're buying off the shelf will
1462
01:01:13.445 --> 01:01:14.645
have AI components.
1463
01:01:14.705 --> 01:01:17.005
So I think it's, yeah, landing on that use case,
1464
01:01:17.695 --> 01:01:18.925
being really clear about
1465
01:01:18.925 --> 01:01:20.805
how you're gonna measure the success of that.
1466
01:01:21.395 --> 01:01:23.925
Then designing what the solution looks like
1467
01:01:24.025 --> 01:01:25.045
for your organization,
1468
01:01:25.425 --> 01:01:27.325
and then getting started with a proof of value
1469
01:01:27.865 --> 01:01:29.965
and then moving into kind of deployment.
1470
01:01:30.025 --> 01:01:33.285
So typically we've been helping clients go on that journey.
1471
01:01:34.285 --> 01:01:37.805
I think now we're starting to see a bit of a change in terms
1472
01:01:37.825 --> 01:01:39.685
of, okay, we get it.
1473
01:01:39.865 --> 01:01:41.645
AI works, we don't really need
1474
01:01:41.645 --> 01:01:43.525
to prove the value as much anymore.
1475
01:01:43.985 --> 01:01:46.245
And I think it does depend a little bit based on
1476
01:01:46.245 --> 01:01:49.525
what the size of the organization is in terms of the approach.
1477
01:01:49.555 --> 01:01:52.405
Sure. But we are very much seeing this move now to
1478
01:01:52.985 --> 01:01:55.765
how do we scale AI across our organization
1479
01:01:55.785 --> 01:01:58.085
and going from kind of use case to platform.
1480
01:01:58.625 --> 01:02:00.805
Um, and I think, yeah, we're seeing that,
1481
01:02:00.945 --> 01:02:02.925
and it shows actually the maturity curve
1482
01:02:02.945 --> 01:02:04.005
is kind of shifting, right?
1483
01:02:04.005 --> 01:02:05.725
Because the conversation is changing.
1484
01:02:06.925 --> 01:02:08.245
Absolutely. And that's ultimately rooted in,
1485
01:02:08.245 --> 01:02:09.365
in change management, isn't it?
1486
01:02:09.365 --> 01:02:11.285
Yeah. You know, that ability to get technology into the
1487
01:02:11.285 --> 01:02:12.565
hands of actual people.
1488
01:02:13.345 --> 01:02:16.885
Um, I suppose, how do you think organizations should go
1489
01:02:16.885 --> 01:02:19.925
about selecting people to work on these problems
1490
01:02:19.925 --> 01:02:21.685
and bring AI into their organization?
1491
01:02:21.705 --> 01:02:22.925
You know, what's the, what's the trick?
1492
01:02:23.785 --> 01:02:25.805
Oh, I think there's no one-size-fits-all approach.
1493
01:02:26.505 --> 01:02:30.245
So I think it does depend on the size of the organization,
1494
01:02:30.905 --> 01:02:31.965
um, industry
1495
01:02:32.505 --> 01:02:36.325
and also kind of, you know, if you are a product company
1496
01:02:36.545 --> 01:02:40.325
and AI's kind of gonna be the core differentiator for you,
1497
01:02:40.525 --> 01:02:42.725
I think, you know, it'll change the approach.
1498
01:02:42.825 --> 01:02:47.245
Mm-hmm. Um, but really I think we need well-rounded teams
1499
01:02:47.915 --> 01:02:50.645
that have individuals that are, you know, responsible for
1500
01:02:51.405 --> 01:02:52.645
strategy, product.
1501
01:02:53.665 --> 01:02:56.925
Um, those SMEs, those domain experts
1502
01:02:56.945 --> 01:03:00.805
who really understand the processes and workflows, with
1503
01:03:00.805 --> 01:03:02.005
that exec sponsorship.
1504
01:03:02.425 --> 01:03:04.205
And then the technical delivery teams.
1505
01:03:04.525 --> 01:03:06.205
I think, you know, really wanting
1506
01:03:06.205 --> 01:03:08.525
to have those well-rounded cross-functional teams.
1507
01:03:08.985 --> 01:03:10.365
And then we talk about this a lot,
1508
01:03:10.365 --> 01:03:12.005
but the importance of soft skills.
1509
01:03:12.605 --> 01:03:14.405
I think in the world of AI,
1510
01:03:14.775 --> 01:03:17.525
those soft skills are becoming more important than ever in
1511
01:03:17.525 --> 01:03:21.405
terms of collaboration, communication, um,
1512
01:03:22.235 --> 01:03:23.405
curiosity, yeah,
1513
01:03:23.505 --> 01:03:26.885
is a huge one, right? I'm sure you've got many more to add.
1514
01:03:27.235 --> 01:03:28.325
Yeah. We talk about, uh,
1515
01:03:28.325 --> 01:03:30.085
natural curiosity being super important.
1516
01:03:30.305 --> 01:03:31.925
Um, I think in all tech adoption,
1517
01:03:31.945 --> 01:03:35.125
but particularly within, um, AI as well,
1518
01:03:35.185 --> 01:03:37.085
as we use tools that we've never used before.
1519
01:03:37.225 --> 01:03:39.045
And, and we'll be using tools we've never used
1520
01:03:39.045 --> 01:03:41.325
before, pretty much day in, day out, forever now.
1521
01:03:41.825 --> 01:03:43.445
Um, it's interesting actually, you know, all
1522
01:03:43.445 --> 01:03:45.085
of our guests today have talked in some way, shape
1523
01:03:45.085 --> 01:03:47.805
or form around the concept of kind of soft skills,
1524
01:03:48.525 --> 01:03:51.085
lifelong learning, um, in many ways
1525
01:03:51.205 --> 01:03:52.805
because clearly the jobs
1526
01:03:52.805 --> 01:03:55.005
that we do right now are gonna change on a sort of month
1527
01:03:55.005 --> 01:03:56.965
by month basis rather than a sort
1528
01:03:56.965 --> 01:03:58.205
of decade by decade basis.
1529
01:03:58.315 --> 01:04:00.925
Yeah. Um, is there anything that, you know,
1530
01:04:00.925 --> 01:04:03.925
you've really learned over the last sort of 12 to 24 months
1531
01:04:03.985 --> 01:04:05.965
that's really helped you with your own AI adoption?
1532
01:04:05.965 --> 01:04:07.365
What has it been that's really
1533
01:04:07.365 --> 01:04:08.605
helped you to double down on that?
1534
01:04:10.115 --> 01:04:12.125
It's a full-time job to keep up with,
1535
01:04:12.285 --> 01:04:13.285
with AI at the moment.
1536
01:04:13.945 --> 01:04:17.685
Um, and I follow a lot of great people on LinkedIn, so just
1537
01:04:17.685 --> 01:04:20.245
that information gathering of what... Shameless
1538
01:04:20.525 --> 01:04:21.525
LinkedIn plug there. Yes.
1539
01:04:21.525 --> 01:04:23.725
Um, but I think the biggest learning
1540
01:04:23.725 --> 01:04:28.565
for me has probably been around, uh, skills and literacy,
1541
01:04:28.565 --> 01:04:31.365
because I think I probably take for granted that
1542
01:04:31.965 --> 01:04:35.445
I come from a generation that, you know, has used apps
1543
01:04:35.585 --> 01:04:36.805
and social media.
1544
01:04:36.985 --> 01:04:39.045
Whereas if I think about, you know, my dad
1545
01:04:39.065 --> 01:04:40.525
and how he uses technology,
1546
01:04:40.965 --> 01:04:43.725
I think there's still quite a long way to go in terms
1547
01:04:43.725 --> 01:04:47.165
of not only AI literacy, but digital literacy.
1548
01:04:47.195 --> 01:04:49.005
Yeah. And that adoption
1549
01:04:49.005 --> 01:04:50.325
and change management is
1550
01:04:50.825 --> 01:04:53.445
so important when you think about your AI strategy
1551
01:04:53.445 --> 01:04:57.805
because you can create and deploy solutions all day long,
1552
01:04:58.225 --> 01:04:59.805
but you need people to use them.
1553
01:04:59.805 --> 01:05:02.405
Yeah. Right. Um, so for me it's been that kind of,
1554
01:05:02.755 --> 01:05:04.685
this is not a one and done approach.
1555
01:05:05.155 --> 01:05:08.085
It's that reinforcement learning that I think
1556
01:05:08.665 --> 01:05:10.805
as we've been working with clients that's really come
1557
01:05:10.805 --> 01:05:13.165
to light how important that really is.
1558
01:05:13.545 --> 01:05:17.645
Um, and then personally on my AI journey, I've been trying
1559
01:05:17.665 --> 01:05:19.485
to get a little bit more hands on in terms
1560
01:05:19.485 --> 01:05:20.685
of building agents.
1561
01:05:20.685 --> 01:05:24.725
Yeah. Um, I think now, like I said, there's
1562
01:05:24.725 --> 01:05:25.725
so much tech available,
1563
01:05:26.425 --> 01:05:27.685
but there are a lot
1564
01:05:27.685 --> 01:05:31.525
of these low and no-code options now available for AI.
1565
01:05:31.585 --> 01:05:33.845
So you know that Microsoft is saying
1566
01:05:33.845 --> 01:05:36.525
that everyone's gonna become a maker of agents.
1567
01:05:36.555 --> 01:05:37.845
Yeah. So I've been trying
1568
01:05:37.845 --> 01:05:39.685
to get a little bit hands-on again. Yeah.
1569
01:05:40.115 --> 01:05:41.845
Yeah. And we see that in a lot
1570
01:05:41.845 --> 01:05:43.005
of the workshops that we run as well, is
1571
01:05:43.005 --> 01:05:45.285
that the best person to solve their problem is the person
1572
01:05:45.285 --> 01:05:47.245
that knows their problem, uh, inside out, right?
1573
01:05:47.345 --> 01:05:48.405
So that makes perfect sense.
1574
01:05:49.105 --> 01:05:53.085
Um, I suppose moving into the future mm-hmm.
1575
01:05:53.165 --> 01:05:55.845
What's the thing that excites you most about
1576
01:05:55.895 --> 01:05:57.645
where AI is going right now?
1577
01:05:58.835 --> 01:06:03.115
That's a good question. Um, I think it's exciting
1578
01:06:03.115 --> 01:06:04.915
that we don't know what we don't know.
1579
01:06:05.105 --> 01:06:07.275
Like if we would've sat here two years ago,
1580
01:06:08.325 --> 01:06:10.465
my role didn't really exist, to be honest.
1581
01:06:11.165 --> 01:06:14.505
Um, so I think yeah, we don't know where things are heading.
1582
01:06:15.105 --> 01:06:18.745
I do think this kind of multi-modality is quite exciting.
1583
01:06:18.935 --> 01:06:20.585
Yeah. And I think the use
1584
01:06:20.585 --> 01:06:24.145
of voice is gonna become more important than ever before.
1585
01:06:24.145 --> 01:06:28.105
Yeah. Um, if you think about Copilot at the moment in terms
1586
01:06:28.105 --> 01:06:32.465
of having that AI personal assistant, we are typing prompts
1587
01:06:33.085 --> 01:06:35.025
and some people have started to kind of dictate
1588
01:06:35.045 --> 01:06:37.505
and start talking, but I think it's gonna become more
1589
01:06:37.505 --> 01:06:41.665
natural for us to walk into our office and say, Hey, Copilot
1590
01:06:41.685 --> 01:06:44.025
or equivalent solution, um, you know,
1591
01:06:44.245 --> 01:06:45.385
what's ahead of me today?
1592
01:06:45.385 --> 01:06:46.425
What should I prioritize?
1593
01:06:47.145 --> 01:06:48.625
I think these digital assistants,
1594
01:06:48.745 --> 01:06:49.945
I think we're just gonna, yeah,
1595
01:06:50.375 --> 01:06:53.455
they're gonna become more integrated as part of our lives.
1596
01:06:54.395 --> 01:06:56.455
And I suppose in, in terms of, um, you know,
1597
01:06:56.555 --> 01:06:58.175
how quickly people are adopting things,
1598
01:06:58.195 --> 01:07:00.015
are there any particular sort of industries
1599
01:07:00.015 --> 01:07:02.615
that you're seeing streak ahead in this area?
1600
01:07:02.835 --> 01:07:04.175
Any people that are really getting
1601
01:07:04.175 --> 01:07:06.055
that competitive advantage early on?
1602
01:07:09.475 --> 01:07:13.285
Yeah, I think it goes back to that maturity curve
1603
01:07:13.345 --> 01:07:14.805
of the organizations
1604
01:07:14.805 --> 01:07:17.285
that have been leveraging machine learning and data science
1605
01:07:17.465 --> 01:07:21.175
or, you know, technology for decades
1606
01:07:21.405 --> 01:07:24.455
they're well set up, you know, in terms of they've got
1607
01:07:24.455 --> 01:07:27.015
that culture already instilled that growth mindset
1608
01:07:27.115 --> 01:07:29.975
of let's try, fail fast.
1609
01:07:30.125 --> 01:07:31.695
Yeah. Yeah. You know, iterate.
1610
01:07:32.235 --> 01:07:36.975
Um, I think retail is always quite far ahead in terms of,
1611
01:07:36.975 --> 01:07:41.335
again, consumer expectations mean they have to be.
1612
01:07:41.365 --> 01:07:45.305
Yeah. Um, media, uh, is also quite far ahead.
1613
01:07:45.405 --> 01:07:48.265
But then I, I think actually we are seeing, um,
1614
01:07:48.485 --> 01:07:50.465
and you would've seen this from your conversations earlier
1615
01:07:50.555 --> 01:07:51.585
today with BMT,
1616
01:07:51.645 --> 01:07:54.945
but we are seeing that highly regulated industries
1617
01:07:55.565 --> 01:07:57.585
are also really embracing this technology.
1618
01:07:57.585 --> 01:08:00.145
Yeah, absolutely. So there is a way of doing it in a secure,
1619
01:08:00.605 --> 01:08:02.905
you know, compliant, well-governed manner.
1620
01:08:03.245 --> 01:08:04.305
So I think we are seeing a lot
1621
01:08:04.305 --> 01:08:08.185
of progress in financial services, um, in public sector.
1622
01:08:09.445 --> 01:08:11.865
So yeah, most industries really,
1623
01:08:12.365 --> 01:08:14.385
And we saw from speaking to some of the guys earlier
1624
01:08:14.445 --> 01:08:18.525
as well, that, um, it really feels like the big barrier,
1625
01:08:19.105 --> 01:08:22.125
um, or I suppose the opportunity for success, is really
1626
01:08:22.125 --> 01:08:24.165
around, uh, the culture of the organization.
1627
01:08:24.165 --> 01:08:25.885
Yes. So you talked around the ability
1628
01:08:25.885 --> 01:08:29.565
to fail fast and rapidly iterate. Um, everybody
1629
01:08:29.565 --> 01:08:31.725
that's talked about success today has talked about being
1630
01:08:31.795 --> 01:08:34.005
able to have trusting leadership.
1631
01:08:34.195 --> 01:08:35.885
Yeah. Um, I suppose you probably talk
1632
01:08:35.885 --> 01:08:37.085
to organizations all the time that are,
1633
01:08:37.085 --> 01:08:39.205
that are all across the gamut there in
1634
01:08:39.205 --> 01:08:40.285
terms of their approach.
1635
01:08:40.345 --> 01:08:43.085
Mm-hmm. Um, because our listeners are likely
1636
01:08:43.085 --> 01:08:45.165
to be technology and business leaders, you know,
1637
01:08:45.475 --> 01:08:46.965
what can they do if their
1638
01:08:46.965 --> 01:08:48.365
organization doesn't already have that culture?
1639
01:08:48.555 --> 01:08:51.005
What can they do to start trying to shift in that direction?
1640
01:08:51.925 --> 01:08:53.045
I think there's a few things.
1641
01:08:53.185 --> 01:08:55.965
So I think AI policy is really important
1642
01:08:56.425 --> 01:08:59.845
and I think putting those guardrails around, yeah, this is
1643
01:08:59.845 --> 01:09:01.005
what you're okay to do
1644
01:09:01.225 --> 01:09:03.605
and this is what you're not okay to do as an employee.
1645
01:09:04.145 --> 01:09:06.005
Um, because I think, you know, we all wanna know.
1646
01:09:06.195 --> 01:09:07.605
Yeah. Safe space. Yeah, exactly.
1647
01:09:08.145 --> 01:09:11.045
Um, so I think that AI policy piece in terms of
1648
01:09:11.925 --> 01:09:13.105
not only the guardrails,
1649
01:09:13.105 --> 01:09:15.025
but actually if you do want to do something,
1650
01:09:15.045 --> 01:09:17.985
what's the process like, who do I reach out to?
1651
01:09:18.365 --> 01:09:21.385
Making sure that's kind of well documented and communicated.
1652
01:09:21.965 --> 01:09:25.545
And then I think just everything with AI needs to have
1653
01:09:25.545 --> 01:09:27.345
that transparency around it.
1654
01:09:27.445 --> 01:09:30.305
So responsible AI principles are really important.
1655
01:09:31.005 --> 01:09:34.025
Um, and then I also think from a leadership perspective,
1656
01:09:34.655 --> 01:09:36.825
this kind of show, not tell.
1657
01:09:37.445 --> 01:09:39.385
Um, so we see a lot of organizations,
1658
01:09:39.385 --> 01:09:42.785
their leadership is generating excitement as part
1659
01:09:42.785 --> 01:09:44.905
of their all hands and their kickoffs
1660
01:09:44.905 --> 01:09:47.625
and kind of showing how they're using it to try
1661
01:09:47.625 --> 01:09:50.185
and foster that culture of innovation.
1662
01:09:50.565 --> 01:09:53.625
Um, so yeah, I think it's a mixture of that kind of policy
1663
01:09:53.805 --> 01:09:58.205
and responsible AI and being really transparent, um,
1664
01:09:58.225 --> 01:10:00.365
and making people feel safe,
1665
01:10:00.515 --> 01:10:02.165
like psychological safety, right?
1666
01:10:02.165 --> 01:10:03.765
Like, this isn't gonna replace your jobs,
1667
01:10:04.355 --> 01:10:08.365
this is gonna add value, um, this is gonna enable you
1668
01:10:08.365 --> 01:10:10.405
to do those things that you really wanna get to,
1669
01:10:10.425 --> 01:10:12.405
but you're always doing this, you know,
1670
01:10:12.405 --> 01:10:13.885
this low-value task over here.
1671
01:10:14.585 --> 01:10:16.765
But then also, yeah, I think that having
1672
01:10:16.765 --> 01:10:19.125
that leadership vision is really, really important.
1673
01:10:20.305 --> 01:10:22.405
And what are you excited about from a Transparity
1674
01:10:22.405 --> 01:10:24.325
perspective over the coming 12 months?
1675
01:10:24.665 --> 01:10:27.205
Um, you know, what it is that we'll be bringing to market,
1676
01:10:27.945 --> 01:10:30.485
um, things that you are particularly interested in,
1677
01:10:30.705 --> 01:10:33.445
or even just things, if you could do whatever you wanted
1678
01:10:33.685 --> 01:10:34.645
tomorrow, like what would you do
1679
01:10:34.645 --> 01:10:35.765
in the space to differentiate?
1680
01:10:36.515 --> 01:10:39.085
There's too much, there's not enough time in the day.
1681
01:10:39.465 --> 01:10:43.665
Um, I think
1682
01:10:44.765 --> 01:10:47.345
we are still learning and growing in this space.
1683
01:10:47.545 --> 01:10:50.825
I think it's really cool at Transparity that we have
1684
01:10:50.825 --> 01:10:53.545
that breadth of technical capability.
1685
01:10:53.885 --> 01:10:56.585
So like I said earlier, it doesn't matter
1686
01:10:56.615 --> 01:10:57.745
what the solution is,
1687
01:10:58.135 --> 01:10:59.745
what problem are you trying to solve for?
1688
01:10:59.815 --> 01:11:01.865
Yeah. And then we can help you with the solution.
1689
01:11:02.445 --> 01:11:07.305
But also I think as we are moving away from pilot use case
1690
01:11:07.485 --> 01:11:10.585
to platform, you know, releasing AI,
1691
01:11:11.405 --> 01:11:14.885
operationalizing AI in a standardized way across your
1692
01:11:14.885 --> 01:11:18.165
organization, you are gonna need to work with partners
1693
01:11:18.545 --> 01:11:21.285
who are capable of much more than just AI.
1694
01:11:21.665 --> 01:11:23.845
And what really excites me
1695
01:11:23.845 --> 01:11:27.645
and what I think sets us apart is that we can support
1696
01:11:27.675 --> 01:11:30.485
with your landing zones, the security
1697
01:11:30.545 --> 01:11:32.925
and governance, identifying the use cases,
1698
01:11:33.725 --> 01:11:35.605
building out an AI platform for scale.
1699
01:11:36.285 --> 01:11:38.245
I think it's, you know, it's not just AI,
1700
01:11:38.275 --> 01:11:41.405
it's your well-architected data framework, right?
1701
01:11:41.505 --> 01:11:44.765
We can help you wherever you are on the journey.
1702
01:11:45.305 --> 01:11:46.845
Um, so I think that's really exciting.
1703
01:11:46.845 --> 01:11:48.885
Like the conversations we're having with clients around,
1704
01:11:49.235 --> 01:11:51.645
this is where you are now, this is where you wanna get to,
1705
01:11:51.665 --> 01:11:54.765
and this is how we are gonna go together on that journey,
1706
01:11:55.525 --> 01:11:56.525
I think is really exciting.
1707
01:11:57.205 --> 01:12:00.405
Absolutely. And I suppose, um, one of my final points,
1708
01:12:00.425 --> 01:12:04.125
and we spoke with, uh, Henrique earlier on today around,
1709
01:12:04.405 --> 01:12:06.205
I suppose the skills gap really around AI.
1710
01:12:06.225 --> 01:12:09.565
And we talked about things like government legislation, um,
1711
01:12:09.635 --> 01:12:11.605
what Microsoft are doing, um,
1712
01:12:11.625 --> 01:12:14.565
but particularly for yourself, um, as a woman in technology
1713
01:12:14.665 --> 01:12:16.405
and you spent time at Microsoft and at Google
1714
01:12:16.405 --> 01:12:19.165
before joining Transparity, you know, what would you say
1715
01:12:19.165 --> 01:12:21.045
to people that are looking to start out on this journey now,
1716
01:12:21.045 --> 01:12:22.925
knowing that many of the jobs that they might want
1717
01:12:22.925 --> 01:12:24.805
to do within AI don't exist yet?
1718
01:12:24.915 --> 01:12:28.085
Yeah. Um, perhaps their career journey isn't gonna look the
1719
01:12:28.085 --> 01:12:30.205
same as yours, but are there any tips that you could give
1720
01:12:30.205 --> 01:12:32.045
to people that are interested in AI right now?
1721
01:12:32.425 --> 01:12:34.045
Um, whether they be new
1722
01:12:34.065 --> 01:12:35.885
or a leader that hasn't touched it before?
1723
01:12:35.995 --> 01:12:37.445
Yeah, I think when you look at the stats
1724
01:12:37.445 --> 01:12:40.005
of women in technology, it's shocking enough.
1725
01:12:40.005 --> 01:12:42.165
But then when you look at women in AI, yeah,
1726
01:12:42.165 --> 01:12:43.365
it's pretty, pretty poor.
1727
01:12:43.705 --> 01:12:48.005
Um, I think going back to those soft skills, being curious,
1728
01:12:48.745 --> 01:12:51.445
um, and just learning around, yeah, what AI is
1729
01:12:51.445 --> 01:12:54.085
and what AI isn't, I think taking advantage
1730
01:12:54.185 --> 01:12:56.725
of the opportunities that are out there in terms
1731
01:12:56.745 --> 01:13:00.045
of meetup groups, you know, the likes of Microsoft
1732
01:13:00.045 --> 01:13:03.525
that provide a lot of free training, free workshops
1733
01:13:03.525 --> 01:13:04.925
to be able to upskill yourself,
1734
01:13:05.425 --> 01:13:09.605
but also to meet a network of like-minded individuals,
1735
01:13:09.605 --> 01:13:11.365
because a lot of your opportunities will
1736
01:13:11.365 --> 01:13:12.765
come through your network.
1737
01:13:13.505 --> 01:13:15.085
Um, so I think stay curious.
1738
01:13:15.865 --> 01:13:19.165
Um, there's a lot of free AI tools, you know,
1739
01:13:19.165 --> 01:13:20.285
they're not paid for.
1740
01:13:21.285 --> 01:13:22.375
Just play around with them
1741
01:13:22.635 --> 01:13:25.015
and see, you know, see what they can do,
1742
01:13:25.035 --> 01:13:27.175
and your skills will develop naturally.
1743
01:13:27.335 --> 01:13:28.975
I think learning how to prompt
1744
01:13:29.675 --> 01:13:31.615
is a really important skill for the future.
1745
01:13:32.275 --> 01:13:34.815
Um, and there's, there's tons of content online.
1746
01:13:34.965 --> 01:13:37.895
It's just immersing yourself in that and, and being curious.
1747
01:13:38.685 --> 01:13:42.815
Awesome. And I suppose for any leaders listening now,
1748
01:13:43.315 --> 01:13:46.335
if there's one tip that you would give them, uh,
1749
01:13:46.355 --> 01:13:48.135
to get started, to have a good framework.
1750
01:13:48.155 --> 01:13:49.895
You know, we talked on a few different topics today,
1751
01:13:50.085 --> 01:13:52.015
like making sure that we understand policy,
1752
01:13:52.715 --> 01:13:55.055
but if somebody's going, look, I just want
1753
01:13:55.115 --> 01:13:56.975
to deliver an AI solution in my organization.
1754
01:13:57.635 --> 01:13:58.895
How can they quickly get going?
1755
01:14:00.175 --> 01:14:01.775
I think setting up an AI steering group,
1756
01:14:02.455 --> 01:14:04.655
bringing together those different voices from across your
1757
01:14:04.655 --> 01:14:06.535
organization to bring people on that journey.
1758
01:14:07.075 --> 01:14:09.295
And then I think use case is king.
1759
01:14:09.435 --> 01:14:11.725
So bringing
1760
01:14:11.725 --> 01:14:13.525
that AI steering group together, starting
1761
01:14:13.525 --> 01:14:16.405
with those envisioning workshops, landing on a use case,
1762
01:14:16.875 --> 01:14:20.285
proving some really quick early business value,
1763
01:14:21.065 --> 01:14:24.045
and then, you know, take that momentum to build
1764
01:14:24.045 --> 01:14:25.405
that into a wider AI strategy.
1765
01:14:25.665 --> 01:14:28.285
But I also think it doesn't need to be siloed, right?
1766
01:14:28.675 --> 01:14:31.005
This should be part of your wider business strategy.
1767
01:14:31.425 --> 01:14:34.565
You don't need to spend time creating separate strategies.
1768
01:14:35.365 --> 01:14:36.845
Absolutely. Jodie, thanks so much
1769
01:14:36.845 --> 01:14:38.205
for your time and insights today.
1770
01:14:38.205 --> 01:14:40.365
Really lovely to speak to you, as always. Thank
1771
01:14:40.365 --> 01:14:41.285
you for having me. It's been a
1772
01:14:41.405 --> 01:14:42.405
pleasure. Thanks so much for tuning
1773
01:14:42.405 --> 01:14:43.085
into our initial
1774
01:14:43.085 --> 01:14:44.365
episode of the Transform Podcast.
1775
01:14:44.825 --> 01:14:46.925
If you want to get started on your AI journey
1776
01:14:47.185 --> 01:14:49.725
and answer those questions around how you can just do AI,
1777
01:14:49.835 --> 01:14:52.485
there's lots of resources on our website, um,
1778
01:14:52.545 --> 01:14:54.365
and lots of free articles you can download.
1779
01:14:54.625 --> 01:14:57.605
Um, and also if you're looking for, uh, resources
1780
01:14:57.665 --> 01:15:00.245
to learn skills, then head over to Microsoft Learn.
1781
01:15:00.245 --> 01:15:02.565
There's lots of stuff to get started on your AI journey.