AI Proving Ground Podcast

Enterprise AI Adoption: How Workforce AI Is Changing the Future of Work

World Wide Technology

Organizations are racing to unlock the potential of workforce AI, but only a few have cracked adoption at scale. In this episode of the AI Proving Ground Podcast, VP of Digital Experiences Joe Berger and workforce AI expert Kait Miller break down why AI pilots stall, how to move from pockets of success to enterprise-wide impact and why employees often trust consumer AI more than company-approved tools. Kait and Joe explore the hidden gap between employee adoption and enterprise rollout, address the build vs. buy conundrum and walk through a real-world case study at WWT that slashed days of work into hours and generated new revenue.

Support for this episode provided by: Omnissa

More about this week's guests: 

Joe Berger is Vice President of WWT's Digital Experience Practice, with over 20 years of expertise in end-user computing, collaboration and AI-driven workforce solutions. He has partnered with Microsoft and Cisco on innovative initiatives and frequently speaks at client advisory boards. Passionate about simplifying experiences to drive productivity, Joe also contributes to outlets like Forrester Research, CIO.com, Wired and CRN.

Joe's top pick: How Forward-Thinking Leaders Integrate Culture and Tech to Amplify Organizational Success

Kait Miller specializes in Workforce AI at WWT. She aims to identify growth opportunities and partnerships in this field, leveraging WWT's capabilities around worker productivity solutions. These tools enhance productivity, efficiency and innovation by integrating AI into everyday business processes.

Kait's top pick: Digital Trends Spotlight: Key Insights on Workforce Productivity Tools

The AI Proving Ground Podcast leverages the deep AI technical and business expertise from within World Wide Technology's one-of-a-kind AI Proving Ground, which provides unrivaled access to the world's leading AI technologies. This unique lab environment accelerates your ability to learn about, test, train and implement AI solutions.

Learn more about WWT's AI Proving Ground.

The AI Proving Ground is a composable lab environment that features the latest high-performance infrastructure and reference architectures from the world's leading AI companies, such as NVIDIA, Cisco, Dell, F5, AMD, Intel and others.

Developed within our Advanced Technology Center (ATC), this one-of-a-kind lab environment empowers IT teams to evaluate and test AI infrastructure, software and solutions for efficacy, scalability and flexibility — all under one roof. The AI Proving Ground provides visibility into data flows across the entire development pipeline, enabling more informed decision-making while safeguarding production environments.

Speaker 1:

From World Wide Technology, this is the AI Proving Ground podcast. Today, across the enterprise world, generative AI is everywhere: in pilots, proofs of concept and small departmental rollouts. But turning those experiments into company-wide adoption is proving far more difficult. Many initiatives never make it out of the pilot stage, leaving organizations with scattered pockets of success instead of a unified transformation. It's not for a lack of interest. Many employees are already using consumer AI tools on their own, things like ChatGPT and Claude, but often outside of official company channels. Inside the enterprise, the barriers are different: unclear use cases, security concerns, data readiness and too many overlapping tools.

Speaker 1:

In this episode, we're talking with two workforce AI experts, Joe Berger and Kait Miller. Joe and Kait will dive into the rapidly expanding landscape of co-pilots, coding assistants, enterprise search platforms and other tools designed to transform everyday work. They'll look at why scaling them is so hard and the practical steps leaders can take to move from isolated success to enterprise-wide impact. Over the course of this episode, you'll hear how leading organizations are cutting through that noise, prioritizing the right AI use cases and building the governance and skills to turn promising pilots into enterprise-wide results. So let's get to it. Kait, thanks for joining today. How are you?

Speaker 2:

I'm doing great. How are you doing?

Speaker 1:

I am doing excellent. And, Joe, how are you? Welcome to the show.

Speaker 3:

I'm great. Thanks for finally getting us on here. I've been trying to get on this thing for a while. I'm excited.

Speaker 1:

Only took us 25, 30 episodes in. We'll make it worth your while.

Speaker 3:

Big fan of the podcast.

Speaker 1:

No, absolutely. We are talking today about workforce AI, all the tools and such that employees across all organizations might be using. I do want to start, Joe. AI definitely doesn't seem like any type of "if" anymore, and it's not even a "why" anymore. It's pretty much a "how." But I was reading, and I'm going to read this here, a 2025 S&P Global report showed that only 27% of enterprises using generative AI have actually achieved organization-wide adoption, and further, a third of those initiatives are still stuck at the departmental project level or in a POC phase. Joe, I'm wondering, why is it proving so hard to move from these pockets of success to enterprise-wide adoption?

Speaker 3:

Yeah, we actually see this come up quite a bit in a lot of our customer conversations, and I think it is pretty widespread. And this is a traditional problem I think every technology faces, whether it was internet, mobile, cloud, you name it. Over the years, this new thing comes out and we're like, oh, there's all these great use cases for it, everyone's going to go run to it and adopt it. And what happens is IT might give it out to the end users or to a specific audience, and they don't have quite the right use case or reason to go use it right away, and so it just kind of sits there. I think what we sometimes see, and honestly, we get this request quite a lot.

Speaker 3:

You really have to have a strong adoption program in mind as you create these types of initiatives, especially with something that's so new like generative AI and AI for the workforce. You have to understand why an employee would want to go use these tools, how it's going to make their life better, how it's going to augment what they actually do in a typical day, and really help them understand what the improvements are going to be to their day-to-day and why they need to start adopting some of these newer technologies. I know, Kait, you talk to a lot of customers about this exact topic. What are you seeing?

Speaker 2:

Yeah, I think that's absolutely critical, the adoption piece here, for all of these workforce AI solutions and employee productivity tools. You can't just put a tool into a user's hands and expect them to intuitively know how to use it, especially after some of the stories that we heard over the last couple of years where employees were inadvertently dropping company IP into some of these public-facing tools. Then we roll something like Copilot or Glean or one of these other tools out without any direction and we say, yeah, this one you're allowed to use. And they're like, are we really? Or am I going to accidentally expose customer IP? Am I going to expose my own company's IP? Anything along those lines. So adoption is there not only to show them how to use it, but to give them the confidence that they can use it, and that it is a secured platform for them to engage with in every aspect of their business.

Speaker 2:

I think the other piece that's really important is having those use cases solidified, and not just understanding that upfront use case, but understanding, if you automate that use case, how much time are you truly going to save? It's so easy right now to go out there and have any number of companies get you started on a POC or a pilot and start testing something out, and we may have a few use cases that we think are slam dunks. But at the end of the day, are they truly saving time, and are they truly going to provide that value back to you? It's important to understand that first, before we get into tool selection.

Speaker 3:

Yeah, and to that point, and Brian, I know you and I went through this a little bit last week when we were together talking to a number of clients at the advisory board, I think the funny thing about this is all these customers, all these enterprises, are struggling to get adoption of the tools that they're actually providing their workforce.

Speaker 3:

But if they went out and actually surveyed their employees, how many of them are probably using consumer stuff on their own, the ChatGPTs, the Claudes, et cetera? I'm sure that number is actually a lot higher. So it's kind of funny that the workforce is actually using these technologies. They might just not be using the ones that the organization has provided for them. And so it's really around getting over that hurdle, too, of how do I get you to use my stuff? Because I have to ensure that it's more secure, that there's governance, that I'm not having that IP or data leak go out to the public-facing ChatGPTs of the world. So it is a big struggle for corporations right now. There's this weird balance: they've got to give out the right tools to the right people but also make sure that they are secure and in compliance with their corporate standards too. So it's a weird dynamic going on, for sure.

Speaker 1:

Yeah, it is interesting that, on the one hand, employees of these organizations are already using the tools available to them, whether it's Claude or ChatGPT or whatever it might be, and at the same time, organizations are having trouble driving that adoption. What's the disconnect there? I know we already kind of went over it in the first answer, but is it that the tools aren't rolling out fast enough to the employees, or what's the gap?

Speaker 3:

There's probably a mix. IT can't keep up with some of the tools on the consumer side that employees are already using. And from a governance and compliance perspective, they're still going through their evaluations to make sure they get the right tools, the right policies and so forth, and maybe that's slowing down the deployment of some of these tool sets. I also think, to Kait's point, do the employees know when to use what tool?

Speaker 3:

I think one of the issues we also see is there are sometimes too many tools that people have access to, and they're not really sure what's the right tool for what type of job. And so you might have Microsoft Teams doing your call transcription and your note-taking and so forth, but maybe they're going out to ChatGPT to summarize a document or create a response over here, because they don't see that one tool might have all the benefits they're trying to get from these other tools. So I think that tool complexity is still probably a problem for a lot of organizations.

Speaker 2:

Yeah, I would add, you mentioned IT having a hard time keeping up. Part of that is intentional, because they're not sure about the structure of their data and what kind of issues they're going to run into once they plug some of these incredibly powerful tools into their organization. What's going to happen with their data? Do they have appropriate data tagging? I know that's been a topic on the pod before; veteran pod guest Ina is on all the time talking about data because it is so critically important. And I do think customers just being unsure of what state their data is in is part of what is slowing down their ability to roll these things out broadly and really get true training out for their employees as well.

Speaker 1:

Yeah. Well, we talk about how many tools there are available to all of us, not only on the consumer side but on the enterprise side. You're talking about co-pilots, coding assistants, enterprise search tools now popping up, and I'm sure there are dozens upon dozens of other tools available from an enterprise or organizational standpoint. Is there a smart way to bucket those so that we can start to think about it more strategically? Joe, I think I've seen before that there are kind of four categories that we use to think about these enterprise tools.

Speaker 3:

Yeah, this is actually something that my team has been working on for the past, sheesh, almost a year or so by now. Because this segment is relatively new, the categorization of the tools is just now starting to happen. You mentioned enterprise search. There's obviously the coding assistants. But really we're seeing more of: there's the productivity side of the house, and then there's the creative side of the house. There's a lot of tools hitting the market all at once.

Speaker 3:

I don't think the analysts, such as Gartner, have even gotten a chance to grasp and say, here are all the different categories that there really are, and have true definitions behind them. I can tell you, we're keeping track of all the vendors out there. Every day we get hit up on, hey, have you looked at this one or that one? We probably have a list of 150 or so startups, or even legacy software companies or manufacturers, that are now coming out with AI built natively into the tools we're looking at here. So the market is just changing so fast that I don't think we've had time to start saying what category this really is right now. And that's how we came to this notion that we call workforce AI.

Speaker 3:

The concept is pretty simple. It's basically any type of AI tool that might affect the workforce, the employee base, and that could be a knowledge worker, it could be a frontline employee, it could be back-end IT, it doesn't matter. But it's really around: what are the tools that are now starting to reach the workforce, and how do you start leveraging them, adopting them, figuring out the right use case? How do you prioritize one over the other? That's really where we're trying to spend a lot more of our time these days, just because we know that our customers are all struggling to keep up with this market. So it's tough, but it's here, and we're just making sure we can stay on top of it.

Speaker 3:

I know Kait spends a lot of her time looking at all these vendors.

Speaker 2:

Yeah, Joe, I've seen some of the trackers and some of the diagrams, and your eyes might start to do circles and spins when you see some of them, right? Everything is just moving so incredibly fast, and we are seeing drastic strides even week over week.

Speaker 2:

At times, I think that goes right back to, hey, what might be slowing enterprises down? Some may be waiting on that killer app, the one thing that solves everything, but in my opinion, that's going to end up leaving customers behind on getting value out of these AI tools, automating different workflows, and asking employees to work on higher-value items. You have to pick a direction and move, and in my opinion, the best way to pick a direction is to start to truly understand your use cases, bring your center of excellence together, have your stakeholders all aligned on solving for specific use cases, and then let those use cases, and the value those use cases are going to provide, guide you toward a tool selection.

Speaker 3:

Yeah. So, Brian, let me talk about an internal use case that we've done at Worldwide. I'm sure you've talked about this on the podcast before, but we've been going down the AI journey for a number of years. Our CEO, Jim Kavanaugh, has been a very big proponent of: AI is going to affect everyone, so how do we embrace it internally? How do we start using these tools? How do we become more efficient, more effective? So one thing we started looking at internally was, okay, look at different departments, look at different ways we could deploy things like Copilot to the masses.

Speaker 3:

But a use case that we saw was specifically in our RFP team. As you can imagine, we get a lot of RFPs from our customers. I think right now we're averaging something like 500 RFPs every single year. So we said, okay, that's an area that's very time-consuming and very manual, but there's probably a lot of repeatability in what we have to do for RFPs. You're intaking it, you're reading it, you're understanding the asks. There's probably some commonality in how you respond to some of these repeatable questions: who is your company, what do you do, things like that. So we said, okay, let's look at that as a use case, and we built what we now call RFP Assistant, where we can ingest an RFP and answer something like 80% of the basic questions in a relatively short amount of time. Something that might have taken us maybe eight to 10 days in years past we're now doing in as little as 24 hours, if not a few hours. I bring that up because, as we talk about ways to start building out use cases for workers, it's understanding what their workflow is, and that's how you can apply an AI use case back to it.

Speaker 3:

Now, that's kind of our philosophy as we start to look at some of these things. You always have to go to the departments and say, let's look at a workflow that you are involved in, and how can AI enhance it or augment it without completely disrupting it? The feedback from the team that's been using this tool now for, I don't know, six or eight months, is that they almost can't live without it. It's gotten to the point of saying, hey, wait a minute, maybe we had three people budgeted to hire into that team; now we don't need them in that group because we've become more efficient, and we can measure the actual results out of it. Now we can go invest in other parts of the business that are maybe under-supported. Or maybe we're seeing, hey, our content repository and that strategy need a couple more resources; let's go invest somewhere else in the business. So that's how we're evaluating workflows, as well as taking that ROI and reapplying it elsewhere.

Speaker 3:

I think the other thing to note there, too, is that's a great example of ROI. We talk about AI so much these days around things like a chatbot or a digital assistant, but really what this is is a workflow automation tool. It's leveraging AI in the background, but it's really just an improvement upon the workflow, and I think that's a really important thing for people to think about. We get so hung up on these ideas of ChatGPT and chatbots, but AI can also be part of a workflow, not just how do I interact with a chatbot.

Speaker 2:

Yeah, I'd pile onto that, Joe, and share how I'm using AI in some of my workflows. Case studies are part of the business that we do in partnership with marketing, and you spend a lot of time interviewing sales teams, going back through notes and transcripts, remembering back to the start of a deal, what the challenges were that were outlined and how a solution is overcoming those challenges. Now, instead of having to spend several hours scheduling time to meet with folks, rescheduling those meetings because of people's calendars, doing interviews, and maybe forgetting a key component that really was a big deal for the customer because we were focused on a different driver of that opportunity, we're having AI perform those tasks for us. It's connected into all of that data; it already has access to the transcripts that were saved from the meetings throughout the opportunity.

Speaker 2:

Reviewing all of those things to identify the challenges and the outcomes of the solution has saved just an absolutely incredible amount of time. I would put it at potentially days of time savings, definitely hours, but probably days, just from working that into the workflow. And then there's also time savings on the sales team side, because they don't have to go revisit a conversation about a deal that's already closed. You want them to review it after the fact, for sure, but they don't have to repeat the entire scenario and review all of those things with you. So, huge time savings there, and bringing these things right into your workflow has been a job changer for me, that's for sure.

Speaker 1:

So those are great examples. I love how we have been tying these AI use cases back to business value, back to those real work streams where they are going to be effective, which is definitely something I know we advise our clients on. Joe, I want to go back to RFP Assistant, because while we're now talking about a lot of value to the business, hours saved, business won, as I understand it, when we first kicked off that use case we didn't know as much about the ceiling of that solution as we do today. So I'm wondering if you could articulate how we expressed the ROI of RFP Assistant back to our executives so that we kept investing in it, knowing that's a real conversation a lot of our clients and organizations have to have out there as well.

Speaker 3:

Yeah, and in fact it's funny, because when we do talk to clients, listen, some of these AI proofs of concept are concept cars. Not all concept cars make it to market. In fact, with RFP Assistant, I think there was a little bit of time at the beginning where this thing was probably not quite working as well as we thought it was going to. Maybe it was hallucinating a little bit and not providing the outcome we wanted. So it did take a little bit of tweaking to get to that point, a lot of testing and fine-tuning things here and there. That is part of the process when you're going through some of these things, especially something custom-built. Obviously there are off-the-shelf and SaaS-based tools out there, but when you're doing something custom like this, and in this use case it needed to be custom-built, we had to fine-tune it. So initially it was, okay, let's start testing the hypotheses. What is this doing for us? Is it saving us time? That's great. How much time is it saving? How can we start measuring that? It was actively meeting with the team that was using it to say, okay, what is the benefit of this? How many hours is it saving for you? And then what we started finding was it wasn't just the hours. If you look at the ROI, it was, what else are we actually getting out of this? I love to use this next story, and I think it's a great anecdote for why this tool works so well.

Speaker 3:

We got an RFP from a pretty complex government agency, and if any of you have done work in the government, you probably know these are very large, tedious documents to read. I think this one was four volumes, so it was something like 80 or 100 pages, and if you know anything about government contracts, they speak their own language. There's terminology in there; it's not an easy thing just to go read. And they gave us a pretty quick turnaround, I think it was like 10 days. Normally we probably would have said no to that type of RFP. We didn't have time to get to it, it was too complex, we would have passed. Because of RFP Assistant, we plugged it in and were able to get to something like 80 to 90% completion within 48 hours. What that means is it allows us to go pursue more deals that maybe we wouldn't have otherwise. So yes, you can look at time savings, but it's not just time savings.

Speaker 3:

As we start looking at ROI, it's how many more RFPs can we get to in a month now? What's our success rate on those RFPs? I think we've actually seen, I'd have to go back and look, something like a 10 to 12% jump in our RFP acceptance rates, because we're getting better at the responses, we're getting more accurate, and we're also responding to more now. So it's actually creating new business for us to win because of this tool.

Speaker 3:

And that matters, because the number one question we get when we talk about some of these tools with customers is, what's the ROI behind it?

Speaker 3:

The ROI might start at one point but grow from there as you start understanding what else it can do for you. What we thought was a time saver is now producing additional revenue for us. That's a great example. And now, because of the success we've had there, we're taking sort of a factory approach and saying, well, RFP is one example, but what else can we do in that same vein? Think unsolicited proposals, marketing content, business value reports. It's the same type of workflow, but now we're repeating it for other areas of our business, looking to achieve the same types of metrics: speed, time, additional win rates, that sort of thing. It's starting to have that flywheel effect, where we're growing the business because of these tool sets. It's finally starting to happen where we are seeing it spin out additional use cases and opportunities for us to continue to grow the business that way.

Speaker 2:

Another great use case there is that any regulated industry is going to have questionnaires and audits to respond to, and AI can help automate those responses and keep them up to date as well. And then, as Joe said, we are responding to more RFPs, but we're also responding to more relevant RFPs, because early on in that scenario we're uploading the RFP document and checking, what's our past success in RFPs that have this request in them? And understanding, okay, these are the ones we should be focusing our efforts on, and these are the ones that maybe haven't been as well suited to us in the past, and knowing where to focus those energies. So, not only doing more, but doing more relevant and specific ones as well. That tool has absolutely been a huge business driver for us.

Speaker 3:

That is a great example. I mean, think about that. If you've ever read an RFP, if you're reading 80 pages of content, that's going to take you some time, and the last thing you want to do is take a day or two to read an RFP and then go, I don't think this is for us. You just wasted a day. This tool can quickly figure out, hey, is this in our wheelhouse? Is this the kind of business we should pursue? So it's a huge opportunity for us to use these tools, because it makes us so much more efficient. It's not wasting cycles and time and resources. It's hugely impactful to our business to work this way now. So, yeah, it's been great for us.

Speaker 1:

Yeah, I mean, it's such a phenomenal instance of how we have built an AI-powered tool here within WWT, and it has all the key lessons learned, right? Tie it to business value, start small and iterate, understand what those ROI metrics are and see if you can expand them from there. And then, not least of which, we are winning new business, which is adding to the top and bottom line. If anybody is looking for more information on RFP Assistant, you can head out to wwt.com. Tons of great content. We'll also make sure to put some in the show notes here. Kait, I am curious.

Speaker 1:

Joe mentioned that we built RFP Assistant on our own, and I'm thinking that's probably because at the time when we started building it, which was a while back, there weren't as many, maybe not any, tools available for us. But these days there's a tool for everything. Walk us through the question of buying versus building. When are we ready to buy off the shelf? When do we need to go and build something like RFP Assistant?

Speaker 2:

Yeah, I think it's a great question, and it's something many customers are struggling with as they make those decisions. The build-versus-buy conversation is extremely nuanced and can go in many different directions. But if we're thinking at a high level, the way I think about things is that there's not as much AI talent in the job market as we would need to be able to build everything, so we do have to make some decisions here.

Speaker 2:

So when I am working with customers, I'm saying, hey, if we're going to be looking at build, that is a lot of people resources and money that's going to be invested. It's a long-term cycle, and there's got to be care and feeding down the road for that solution as well. So if we're looking at build and investing that much, let's look at building something that's going to create a competitive differentiator in the markets we're selling into and allow us to get a leg up on our competition. And then, if we're looking at something like employee productivity, where there's absolutely an ROI associated with these solutions, there's time savings for employees, there are productivity gains to focus on higher-value items, that's a great opportunity to look at off-the-shelf tools, tools like Glean, like Copilot, like Windsurf, where we can get moving much more quickly at a lower cost and get up and running.

Speaker 2:

The other reason we would want to look at build, in the same vein as the RFP Assistant Joe mentioned, is how much customization you need within a use case. If you only need minor customization, buy is another great option. But if you need to customize something extremely specifically, for your vertical, for your environment, that's another reason to look at build. There is, right now, a lack of AI talent in the job market, though, and there needs to be a concerted effort to upskill and reskill the employees within these environments as well.

Speaker 3:

Yeah, the other thing I would add is, how much of that is your core IP versus sort of general? If you're doing drug discovery, that's your core IP; you're going to build all that stuff in-house. But if you want an AI system that's going to help you summarize emails or do some basic research for you, some of the things we've already been seeing out of Copilot, go buy that. It's already prebuilt, it works pretty seamlessly, and it integrates with some of your other data sources. But for that core IP, where it's really your company's knowledge, that's where you're seeing a lot more use cases around the build philosophy. This episode is supported by Omnissa.

Speaker 3:

Omnissa specializes in data management and analytics solutions to unlock the value of information. Drive informed decisions with Omnissa's data expertise.

Speaker 1:

Well, whether you build or buy, the number of AI-powered tools that organizations will be using is going to continue to grow and grow and grow. Joe, you mentioned the client advisory board that the two of us were at last week, and one of the comments there I thought was interesting, and I'm going to read it here: once that flywheel starts spinning, whose responsibility is it to support this going forward? And then the interesting part was: I can't wake up five years from now having my IT team support 50 AI platforms or more. So how do we keep our hands on this going forward, knowing that it's just going to expand?

Speaker 3:

Yeah, I mean, this is where you start getting into things like COEs and governance boards. I know everyone loves talking about governance boards, but it really is aligning between IT, the lines of business and even the executive team around how you start looking at this more holistically. I think what happened a lot of times is everyone kind of waited for IT to give them the blessing, and then, as we started getting into things like shadow IT and shadow AI, the lines of business went out there and got it themselves, and all of a sudden, next thing you know, you've got, like I mentioned earlier, IP and data leakage. So it's got to be a partnership between all these teams to understand who's doing what, and to have some sort of policy and process around: what are we going to deploy? What are we going to integrate our data sources into? What is the governance and security behind all of this? It really is a team effort.

Speaker 3:

You know, one thing we've started saying is that rather than just calling it governance, it's more of an acceleration.

Speaker 3:

Rather than looking at this as, how do we stopgap everything, it's more of, how do we accelerate it to be better? It's a different mindset, but it does allow you to say, okay, we know we need to do this. We know we have to have controls and limits on certain things. How do we collectively look at this as a company, with the right stakeholders and the right people responsible, and say, okay, here's how we're going to do it, here's our policy for doing it? And having that review board to say, hey, you can't just keep buying stuff off the shelf every week because it's the hot new thing; there's got to be some level of inspection on how you roll these things out. Because, let's be honest, in five years probably every software tool we have access to will have a level of AI built in. It's just the nature of where the market's going, and so it's understanding that and having some rules around what you deploy and what you don't.

Speaker 2:

I think every tool that we're using might already have some level of AI built into it, let alone five years from now.

Speaker 2:

I think that customer question from the advisory board might be accurate; we might wake up five years from now and have 50 different tools. Again, it goes back to: we're not necessarily waiting for that killer app. One of the things I like to do is think of AI more like an employee. When you're hiring an employee, that employee has roles and responsibilities and tasks they're expected to accomplish and achieve. You understand the KPIs you're going to measure success by for that employee, and you understand the data that employee is going to have access to.

Speaker 2:

I think that's a really strong way to start looking at some of these AI tools as well. You may have 50 AI tools that you've quote-unquote hired, but each of them has roles and responsibilities and things they're accountable for accomplishing within your environment. And as we keep moving toward agents and workflows becoming agentic, you may have a subset of IT, a team of folks, managing all of those different agents, and there may be more than 50. At that point, I think that might seem more reasonable once we really start to break it down and understand what we're asking each of those tools, and types of tools, to do and perform. Then, where you see overlap, certainly there's tool consolidation. If you really understand what you're asking the AI tool to do, it's less about having too many tools to manage and more about really understanding what you want each one to accomplish for you.

Speaker 3:

So Kait went there, and I'm glad she did, because one thing we always used to start our presentations with was: AI is not going to take your job, but somebody using AI will, which I think is accurate. Today, we are starting to get to that theoretical question: will AI replace my job? And we are starting to see, I mean, it's very slow right now, that in some instances AI is starting to replace jobs. As I mentioned earlier with that RFP assistant, we didn't use AI to replace a job, but we also didn't hire additional resources, because AI meant we didn't have to.

Speaker 3:

And I think Kait is going in a direction that's something we've kind of been saying around here, and here's my personal opinion on this. I do think IT, and it's funny, I even think recruiting, will at some point have a bag of AI agents in their tool bag, and that's how we will start hiring in the future. I'm going to say, hey, I need five head count for this specific task. Three of them are going to be fulfilled by an AI agent. What's the right agent for this job? And I think what happens is IT will end up being responsible for ensuring the quality and security of those agents.

Speaker 3:

But it's the line of business going to say, here's the job I need, here's the request I have. What do you recommend I use to fulfill that need? And I think that's where we're going. I've kind of started framing this notion that you're going to have a recruiter of AI tools at some point. That's how you're going to fulfill jobs. You're going to have someone fulfilling the human requirement: yeah, I need a physical person to sit here and do this task. But on the other side, it's, hey, I need something in HR that does this repeatable task for me. I need an agent for six months to go do this for me. And I think that's where we're going to get to, whether that's in six months or five years. I don't know if the market quite understands that yet, but I think that's where we're headed. Kait, do you agree?

Speaker 2:

I agree that that is most likely where we're headed, I do. I also have been accused of being overly optimistic, but every time we've seen these technology advances in the past, we actually see the job market expand, and those jobs are filled by folks who are trained on how to use the new tools. So, at risk of bringing this full circle back to adoption services, which came up early on, it is critically important that organizations are upskilling and reskilling their employees, because some of those jobs will be automated. But what company out there is saying, I just wish I could do every single thing I'm doing today, automate all of it, and never do anything more? There's always going to be more to work on. There's always going to be a way to grow, and higher-value tasks for people to take on. But we really have to bring our employee base along with us on that journey, and that's where adoption, I don't know how else to say it, is critically important for these solutions.

Speaker 3:

Yeah, so it's kind of funny. Summer's winding down and our intern just left.

Speaker 3:

So one of the tasks we gave her this summer was: tell us how a college kid uses AI these days at school. Right, like, are they really cheating on tests and doing their math, their papers, with AI? And she gave us a pretty good rundown of how she's using it to prep for tests, maybe do some research on her own, summarize meeting notes and so forth. But the point being, you've got a generation in their 20s who's been using this stuff at school for the past two to three years. Now they're entering your workforce, so they are going to be a bit more enabled to use these tools.

Speaker 3:

I think the challenge is it's probably not so much that generation we're nervous about. It's the aging population that maybe doesn't want to go get upskilled to use these tools. That, I think, is the concern for a lot of organizations: how do I get my mid-level and senior-level employees to start understanding how to use this stuff to be more efficient? Kait mentioned the upskilling piece; that's a big concern for a lot of clients right now.

Speaker 1:

Yeah, have either of you heard the term FOBO? This was something I just uncovered while looking over some notes before this episode. FOBO, F-O-B-O, just like FOMO, but it's fear of becoming obsolete, and I think that's what a lot of people are dealing with right now. There's change happening, people are resistant to change, and there is a fear of your job becoming obsolete. But are we seeing anybody, either internally here at WWT or in organizations we interact with, who recognizes that FOBO and turns it into FOMO, fear of missing out, which then drives people to adopt AI?

Speaker 2:

I would say that might be the journey we've seen over the last couple of years. When a lot of these tools initially dropped, consumer adoption was just so easy. ChatGPT is one of the easiest tools out there for a consumer to just download as an app and start using, almost like they're playing a game on their phone. And past technology hasn't necessarily been like that. So when consumers were seeing that rapid adoption of those tools, I think there was that FOBO, a great term, I didn't know it: I need to learn these so that doesn't happen to me. And that turns into FOMO. And again, Joe, you mentioned we're not necessarily worried about college kids entering the workforce because they're already a little bit enabled. But that comes with an expectation of having tools available for them to use.

Speaker 2:

And again, only 27% of enterprises are getting value here. When those graduates enter the job market, they're going to expect that onboarding is easy and that access to information within the organization, to learn about the organization, what they're supposed to be doing and who their peers are, is available to them. And I think that's coming true for some of the aging population as well, where they've now gone from FOBO to FOMO. They're looking to get enabled, and they're looking to understand how they can get better at these tools so they don't become obsolete. And it comes right back to: organizations need to give them a secure way to do that, with company tools and a plan and training in place for them to learn how.

Speaker 3:

Yeah, that's the one thing we keep finding over and over again, and I know we're kind of talking about adoption and all that: it's training. Whether that's internal training from your internal comms teams, or going out there and taking master classes in this stuff. I know my Instagram feed is littered with people saying, oh, go spend 15 minutes a day on these microsites to learn AI. But it really is taking the time and prioritizing this, to upskill your employee base so they know how to use these tools and the benefits to them. Something else I want to throw out there. Brian, you and I had this conversation in the car last week. In your role, creative services and content writing, that's obviously a skill AI is starting to get really good at, and you said, well, am I going to be obsolete in a couple of months?

Speaker 3:

And you called out something really important. As we talked about the role of prompt engineering: prompt engineering is being inquisitive and asking the right questions. I think it's taking a skill set and applying it in a different manner. To your point, being a researcher or writing articles is by nature being inquisitive and curious. I think that carries over into this new world of prompt engineering and prompt writing. It's starting to understand which skills transfer into this new world and just make you a lot faster and more efficient.

Speaker 3:

That's the kind of mindset I think people have to start having here. It's, what does my skill set transfer over to in this new world, and how does it make me more valuable, versus, oh shoot, I'm at risk now? I think that mindset is where people need to be trained a little bit as well. Maybe it's the softer side of those skills, but it's that transfer into this new way of working that people have to think through. It's not just how do I write a prompt; it's teaching people the new way of working, and I think that's what a lot of organizations are still going through.

Speaker 2:

Joe, as you're talking through that, I'm thinking futuristically too. What does that do to change communication as well? Does it improve communication? How does it change the way we interact IRL, if you will, outside of the prompt engineering? As we're changing the way we communicate and prompt for information, minor tweaks can make big impacts on the results you're given, and does that start to impact the way we communicate live as well? It's interesting to think about.

Speaker 1:

Yeah, I'm happy you mentioned that conversation we had in the car, Joe. Curiosity is definitely something I think helps in the new age of AI, but you can't be curious unless you feel safe. And once you feel safe, then maybe you're open to the training. I'm happy you mentioned the training too, because I saw a funny yet probably serious study about training. This had to do with Copilot. It said nine out of 10 employees said formal training would be helpful; in other words, they were wanting it. But the second half of it was that seven out of 10 skip the onboarding videos when it's time to actually take the training. Training is kind of a notoriously boring aspect of this. So how can an organization make it better, to accelerate that adoption?

Speaker 3:

You mean the mandatory trainings aren't fun?

Speaker 3:

They're great, they're always fun. I agree, and I know there is a science behind training and educational classes to that degree. We kind of have different philosophies on this, whether that's in-person training, webinar training or short-form video training. I know even internally we do a lot of internal emails: hey, try this prompt, try this trick, go watch this video. But you can only push so far; sometimes it does take the employee wanting to understand it.

Speaker 3:

I have seen success where it's sort of that don't-be-left-behind mentality: I see my peers doing something really well and they have adopted it. Maybe they're the influencers of it. Find those people and help it grow organically that way. You're still going to provide all these levels of training, but once you see success, promote success. This group did it this way; get everyone on board to do it that way, or understand how they use it and copy them. People respond when they see what good looks like from high performers or early adopters, and this is true for any kind of technology. So promote those early adopters in the organization and really highlight them to show, hey, this is what success can look like. Copy that, learn from that.

Speaker 2:

I think there's a missing question in that survey, Brian. Out of those nine out of 10 who said training would be helpful, I'd like to know how many of them knew the tool existed. I like to tell a story. I've got a friend who works at a small law firm; I happened to know they're a Microsoft shop from questions I've been asked in the past. I asked, hey, are y'all taking advantage of Copilot? And, just, what the heck is Copilot? was the response. And they have access to it. It's deployed, it's available right on their screen.

Speaker 2:

And so, you know, training is critically important, but making sure employees are aware that these tools exist is also important. And then, one thing I always fall back on is having champions of those tools throughout different lines of business, because I know when I have a question, I'm probably not first going to a tool. I'm pinging a friend, like, hey, is Monday a holiday, or is that not one we get? And now there are AI tools in place for us to go get that information. But pinging a friend is sort of the first step an employee might take to ask a question, and if the person they're pinging can say, you know, go ask the AI tool this question, you're halfway there to making sure there's more awareness that these tools exist and that employees are allowed to be using them in their job role.

Speaker 3:

And then guiding them toward all the types of training that Joe mentioned can be the next step there too. Yeah, I think I have gotten much better at telling people to go ask. If I get a question like that, I just write back: go ask our internal tool, Adam, or go ask Copilot, or go ping Glean for this. I feel like it's how I talk with my wife about our kids: if you keep doing something for them, they're never going to get better at it. I do the same thing with our employees. If you keep asking me and I answer, I'm never going to train you on why to go use the tool. But if I start saying, no, here's a better way to do it, start learning that, that starts reinforcing that behavior for sure.

Speaker 1:

Yeah, I do want to recognize that we're coming up short on time here, so we'll wrap up. But I do want to ask you both: we touched on a lot of stuff here today, understanding the landscape as it relates to what tools are out there, the buy versus build strategy, what to do to drive adoption, all of that encompassed in workforce AI. Joe, we'll start with you. Just last thoughts on what organizations should be doing right now to put themselves in position to win in the next six to 12 months as it relates to enabling their employees to utilize AI.

Speaker 3:

I think, if anything, it's prioritizing which use cases you want to attack. And then once you say, hey, these are the top three or five, and once you get past the security and compliance of those use cases, it's really getting to: how do we measure the ROI? Do we have a strong adoption plan to drive adoption of that tool, and metrics to measure along the way so that, to my earlier point, you can promote its success? So if we're trying to save time, reduce costs, whatever it is, make sure those goals are well known and you can measure them. That way it's easier to go from, okay, we've piloted it, it was successful, now let's get out of the pilot phase and keep growing it and keep expanding the adoption of it. That's how you're going to grow something like this that is pretty new to the workforce.

Speaker 2:

Yeah, I would say don't wait. Everything is moving extremely quickly. It's a confusing landscape, but the opportunity is in front of us to evaluate our use cases, make a decision and take that first step. You may have to shift down the road if another capability comes out, but the longer you wait, the further behind you're going to end up. We talked about data governance and security being really important here. The longer you wait to identify a tool and a use case to attack, the longer you wait to look at your data and security, so you're going to be in the same position six months from now, a year from now. Maybe there is some tool out there that ends up besting all the others, but you're still further behind because you haven't done all the other pieces. So the number one piece of advice I have is: don't wait.

Speaker 3:

And real quick on that too, and Brian, this kind of came up last week: these don't have to be the grandest of ideas for implementing AI in your organization. You don't have to completely change your organization. Start with the boring stuff, right? Coding assistants: significant impact to your development community just by implementing a coding assistant. There are off-the-shelf tools like Windsurf or Coder and some of these others that are really good, easy to implement, and can quickly produce an ROI. So don't think that every AI proof of concept or use case has to be complex. Start with the small, basic stuff, because you can see significant impacts from some of those. Just prioritize which ones you want to start with and go from there.

Speaker 1:

Yeah, start with the layups before you go for the slam dunks. Good words to end on. Kait, thank you so much for joining, and Joe, thank you too.

Speaker 2:

It was a blast. I was excited to be here.

Speaker 1:

Yeah, we'll see you next time. Okay, lots of great context from Joe and Kait; thank you both for joining. A few key lessons I'm taking away. First, adoption is not automatic. Rolling out AI tools without clear use cases, training and trust leaves them unused, even as employees turn to consumer tools outside the company walls. Second, ROI grows with refinement. The most successful AI deployments start small, iterate and then expand, often uncovering benefits far beyond the original goal, from time savings to new revenue streams. And third, strategy matters as much as the technology. Whether you buy or build, the key is aligning on priorities, ensuring governance and investing in upskilling so every employee, from new hires to seasoned veterans, can thrive alongside AI.

Speaker 1:

If you liked this episode of the AI Proving Ground Podcast, we would love it if you gave us a rating or a review. And if you're not already subscribed, don't forget to subscribe on your favorite podcast platform, where you can always catch additional episodes, or find content related to this episode on wwt.com. This episode was co-produced by Naz Baker, Cara Kuhn, Marissa Reed and Ginny Van Berkham. Our audio and video engineer is John Knobloch. My name is Brian Felt, and we'll see you next time.
