AI Proving Ground Podcast: Exploring Artificial Intelligence & Enterprise AI with World Wide Technology

From Pilots to Productivity: Unlocking the True ROI of Microsoft Copilot

World Wide Technology Season 1 Episode 34

Millions of employees now have Microsoft Copilot at their fingertips, but only a small fraction of organizations have scaled it successfully. In this episode of the AI Proving Ground Podcast, World Wide Technology's Mike Davis and Softchoice's Craig McQueen share hard-won lessons on driving artificial intelligence adoption, overcoming change management challenges and moving from pilots to measurable ROI. If your Copilot rollout has stalled, this conversation could be your roadmap forward.

Support for this episode provided by: Logitech

More about this week's guests:

Craig McQueen, Vice President for Digital Acceleration, provides strategic direction and leadership in driving the creation of highly differentiated, customer-centric service capabilities and offerings for Softchoice. Craig and his team bring the Softchoice strategy to life by painting a vision for the technologies we take to customers and leading the development of the services required to help our customers adopt the most secure solutions across cloud, data center, collaboration and the digital workplace.

Craig's top pick: Microsoft 365 Copilot Wave 2: Spring 2025 Release Overview

Mike Davis has more than 20 years of experience in the IT services industry, specializing in voice and video technologies from Microsoft, Cisco, Zoom, Poly and Pexip. With a business-driven approach, he brings extensive knowledge of professional services to guide cross-functional teams in diverse projects from execution to completion, and he is recognized as a trusted advisor for public and private sector customers throughout the technology life cycle.

Mike's top pick: Digital Trends Spotlight: Key Insights on Workforce Productivity Tools

The AI Proving Ground Podcast leverages the deep AI technical and business expertise from within World Wide Technology's one-of-a-kind AI Proving Ground, which provides unrivaled access to the world's leading AI technologies. This unique lab environment accelerates your ability to learn about, test, train and implement AI solutions.

Learn more about WWT's AI Proving Ground.

The AI Proving Ground is a composable lab environment that features the latest high-performance infrastructure and reference architectures from the world's leading AI companies, such as NVIDIA, Cisco, Dell, F5, AMD, Intel and others.

Developed within our Advanced Technology Center (ATC), this one-of-a-kind lab environment empowers IT teams to evaluate and test AI infrastructure, software and solutions for efficacy, scalability and flexibility — all under one roof. The AI Proving Ground provides visibility into data flows across the entire development pipeline, enabling more informed decision-making while safeguarding production environments.

Speaker 1:

From World Wide Technology, this is the AI Proving Ground Podcast. Today: Microsoft Copilot. It's everywhere in the headlines and inside most organizations that run on Microsoft 365. The promise is enormous: smarter meetings, faster documents, less time lost to email and the potential to rewire how entire teams work. But here's the paradox: companies are seeing real gains, yet adoption still lags. Most are stuck in pilot mode, even as early users report measurable productivity improvements.

Speaker 1:

Joining me are two leaders who've been in the trenches of Copilot adoption: Mike Davis from World Wide Technology, who's seen firsthand how enterprise teams wrestle with both the promise and pitfalls of AI assistants, and Craig McQueen from Softchoice, who's helping clients rethink not just their tools but the way they actually work. For context, World Wide Technology acquired Softchoice at the beginning of this year, and the two companies are helping organizations of all kinds, from Fortune 100s to mid-market companies across industries, adopt AI solutions like Copilot. Together, Mike and Craig will help us unpack why Copilot adoption is so challenging, what it really takes to move from pilots to impact, and how organizations can turn this tool into a true driver of transformation. So let's jump in. Hi, Craig, welcome to the show. Good to see you. And Mike as well, welcome to the show. How are you doing?

Speaker 2:

Good. Thanks for having me, Brian.

Speaker 1:

Excellent. We're talking about Microsoft Copilot here today: adoption, utilization, tools and features, things like that. It's interesting. I feel like Microsoft Copilot might be one of the more intriguing AI topics unfolding today across millions of organizations, many of which are utilizing Microsoft products on a day-to-day basis.

Speaker 1:

The statistics that I've found (and I'm going to read a couple of bullet points that I prepared), while confounding, tell an interesting story. Microsoft itself reports a 70% increase in productivity, with users completing tasks 29% faster on average when using Copilot, and another Microsoft community report found that nearly 70% of Fortune 500 companies have integrated Microsoft 365 Copilot into their daily workflows. So those are two stats, and yet broader adoption seems limited. A June 2025 Gartner survey of 215 IT leaders found that 72% of companies were still in the pilot phase for Copilot and only 6% had achieved global deployment. And then, just to confuse everybody a little bit further, the same Gartner survey found a paradox: 94% of respondents reported measurable benefits from Copilot, but they were still slow to scale it up. So that was a lot right there. Craig, I'm going to start with you. Where is this paradox, this gap, coming from? And level-set for us a little bit on where we are as an industry when it comes to adopting Copilot.

Speaker 3:

Yeah, and Brian, hopefully I can shed some light on this. A key phrase you said was "integrated into workflows," because Copilot is just a tool, and a tool by itself doesn't achieve anything. But if you think about how employees work today and how they might work differently with AI, and then use that tool, Copilot, to enable that different way of working, that's when you get the benefits, and that's exactly why you see this surprising diversity in results. There are those organizations that just give the licenses to people, maybe do a lunch and learn, and hope that things will be different, and they won't be. Other organizations say: okay, for our sales reps, we know the sales workflow can be different and we can design a different one. Or recruiting: from the job description through to follow-up, Copilot can be designed into each of those areas. So when you operate on purpose, integrating change management and adoption, you'll get those business outcomes. If you hand out licenses and do lunch and learns, you'll be disappointed.

Speaker 1:

Yeah. Mike, what are you seeing on that front as it relates to adoption? Why do we have this gap, or just this generally wide range of outcomes?

Speaker 2:

Well, imagine you're an individual contributor in your organization. Suddenly you show up to work one day and there's somebody sitting next to your desk and your boss introduces them as your new assistant. You had no introduction of that assistant before. You didn't know they were coming.

Speaker 2:

Suddenly you have one in your office, and if you're used to doing things in a certain way every single day, having to suddenly stop what you're doing and train a new individual on what you require from them takes a bit of time. If you're not prepared for that, you're going to end up going back into old habits and not utilizing them. So these digital assistants, in a lot of ways, need to be treated like another employee in the organization. You're spending money per month on this feature, so you need to make sure you're getting the best return on your investment. So one, as Craig was alluding to, is making sure you have the proper use cases, that the license is going to the right person who's going to utilize it properly and maximize the investment. It takes time to go through and figure out the different personas and the types of use cases for different teams throughout the organization, so that you can hand the license to someone who's going to use it on day one, and then back that up with training and preparedness.

Speaker 2:

Get the individuals and users ready for what is coming so that when Copilot is enabled for them on day one, they immediately know how to start utilizing it. And once you start, you're looking for that aha moment, so to speak, where, as you're using it, you start to realize: oh hey, here's a new way I can use it that's actually saving me 20 minutes a week. Oh, here's another way I'm discovering that's saving me an hour a week on certain things I'm doing. It might be small little things that you're using that AI assistant for at the beginning, but that time starts to add up, and over the weeks and months, as you're using it, it just becomes second nature, and the next thing you know, you're in sync with your AI assistant.

Speaker 1:

Yeah, fantastic analysis on the reasons behind why, and we could go in so many different directions right now. We talked about the gap being caused by integration and potentially some technology issues. Obviously, we're talking about change management and people. We're also talking about processes and strategy. Let's unpack a little bit of both, or maybe start with the people piece. Craig, I am curious: what do organizations need to rethink about change management or training their people, as it relates to Copilot, to help spur more adoption and more utilization? Is there something they need to change right now?

Speaker 3:

Leaders have to say they're going to do it and lead by example, by sharing examples of how they're changing the way they work. That's absolutely crucial because, just like with any change, if there's not executive sponsorship and a reminder that things are different, they won't be. And picking up on what Mike was saying, you have to provide that leadership and direction to make sure things are different, and that tends to be one of the most important things when bringing AI into a department within an organization.

Speaker 2:

What Craig was talking about with sharing is, I think, very critical. Within Worldwide, when we rolled out Copilot, sharing was a very big aspect of that. We had chat groups stood up where people could share how they're using it, issues they were running into, or proper ways to prompt, which is a very big part of using AI assistants: being able to prompt correctly in order to get the outputs you're looking for. So I'd say it's the whole sharing piece, once you've got that executive leadership, of course, opening it up to all your employees so they're actually collaborating on this new tool and how it's being used. At least for me, it really started to open up doors on how I can use these AI assistants in ways I never even thought of before, because someone else had already discovered it and shared it with me.

Speaker 1:

Yeah, Mike, can you double-click on that? I'm glad you brought up the use case of what we've done here at Worldwide Technology. We've rolled out Copilot. I use it all the time. I know a lot of folks on my team and within our organization use it. What other key lessons learned, whether pitfalls to avoid or successes that we've had, helped drive adoption beyond what you just mentioned? What else could others learn from us as it relates to driving adoption of Copilot?

Speaker 3:

Yeah, Brian, one of the most important things that we found is to make sure an identifiable metric or outcome is defined, and what that does is really ground you in: well, what are we trying to achieve? So within Softchoice, for our sales organization, we did a pilot to find out: okay, does this make a difference? We took 90 of our 500 salespeople. We know, based on years of data, and anybody in sales knows, that activity leads to results. So we knew that more customer meetings in a week would lead to results.

Speaker 3:

So part of the design of our rollout was to enable more customer meetings with quality, and we measured that. And I highly advise: don't create a new metric, because it's hard enough to put a new metric into an organization. Pick an existing business metric. Sure enough, we found that those salespeople who were trained and had the change management around Copilot had over 30% more customer meetings in a week. And because we focused it around a metric, it was very clear, when we wanted to expand to the rest of the organization, what the ROI was, because although we sell millions and millions of dollars of Microsoft licenses, we have to pay for Copilot just like everybody else.

Speaker 2:

Yeah, with Worldwide, I think it was a little more about putting them out to the masses and individual groups to see what stuck. Then we monitored constantly to see where it had the best impact on specific teams, and from there we were also able to see where we needed additional training to follow it up. If we noticed specific groups not utilizing it much, we would reach out to those groups and find out why. If it was working well for some members of the group but not being adopted by all, we'd dig into that and find out why. A lot of times it was more of a training issue: they just weren't properly trained on how to use it or how to start using it, and people fall back into old habits if they don't really find the use case for it immediately.

Speaker 2:

It really took those people who like to tinker and play around with new tools, the ones who went out and discovered what it could do.

Speaker 2:

And even today, daily, we're still sharing information in some of our groups, suddenly discovering a new aspect of Copilot, or: hey, here's a very interesting prompt you can use that gets a really cool output. So it's constantly changing, constantly developing. And with some of the new features that came out with Copilot, especially some of the research capabilities, that to me has been a pretty big game changer, because it can go out and do a lot of external research, not necessarily against internal data. Like Craig was talking about with sales teams, I can do research on my customer. I can gather more information than I have readily available to me internally within our own data sets, and I'm able to be more properly prepared when I'm going in to have that customer meeting. Or if I'm researching technical aspects of specific products that I'm pitching to a customer, I know what their competition looks like. With the changes that are constantly coming to Copilot, they're always adding new things, and I'm always discovering new features and functions.

Speaker 1:

Yeah, Craig, I'm glad you mentioned that peer learning. I saw a recent study, and I mentioned this on a previous episode of the AI Proving Ground Podcast, but it was a study that examined Copilot adoption and found that most users are ignoring formal onboarding materials. Nine out of 10 said formal training would help them, but seven out of 10 were skipping the onboarding videos and preferring that peer learning, or just trial and error and tinkering. Is there a way to formalize that peer learning or trial and error without losing any of the speed you might need to drive adoption?

Speaker 3:

Yeah, the way we usually do it is having a champions program. As you mentioned, people tend to like to learn from their peers, and when you look within an organization, there are usually those early adopters that people look to: you know, what are they doing? And we formalize that. So when we design a change management and adoption program, we make sure that we identify those champions. They're happy to be elected and recognized for that, but then we give them some responsibilities on what it really means to be a champion. And so then you get that blend of, yes, there is material to learn how to use it, but the social sharing within the organization seems to be pretty effective too.

Speaker 1:

Yeah, Craig, I want to stick with you here for a second, because I do want to get into use cases and how we talk about ROI. But first, and maybe this isn't the first step, I do feel like a lot of organizations are faced with a licensing question as well. So how do we think about licensing Copilot effectively? How do we balance the cost of a widespread rollout versus starting small? Who gets access first? That type of thing. How should organizations be thinking about Copilot in that regard?

Speaker 3:

I'm so glad you brought that up, Brian, because people are missing an opportunity to actually protect themselves a bit. What I mean by that is, if you ask a CIO, hey, do you think any of your employees are using a consumer-based AI assistant such as DeepSeek or Grok or Perplexity, they might go, maybe. So what I'd highly advise as a very first step is to get safe, and what I mean by that is Copilot Chat is free for pretty much any Microsoft customer. You asked about licensing costs; well, actually, you can get people safe first by giving them access to an enterprise-grade, protected AI assistant. Now, you still need some change management, because you'll need to share with people: hey, here's why you should be using this one instead of DeepSeek. But that gets the entire organization safe with Copilot Chat. So that's usually what I recommend as a very first step. Then, as you said, all right, the thing about Copilot Chat is that it's not connected with the Microsoft Graph, the Microsoft Graph being all of your documents, emails and Teams. The benefit of that is you don't have to worry about internal security, because it's not connected. You're not going to surface that spreadsheet that had all the employees' salaries, you hear that story going around. So you're protected by not being integrated, but you're not getting the value of having all of that enterprise data available within your AI assistant. So then the next step is: all right.

Speaker 3:

Well, when we want to really enable people by having access to the Graph, who should that be? This comes back to our earlier conversation: I suggest finding the strongest executive sponsor. And it's so important, if you're going to spend the money on the licenses, to demonstrate some early wins. We've seen this not just within organizations but between industries. Some people will ask me, "Craig, which industry is moving the fastest?" I'll say, well, that's not what I've seen. I've seen that the one with the strongest executive champion tends to move the fastest. So usually my advice is: find an area where you'll get success first, design the program, buy the licenses for that group, and then expand to others.

This episode is supported by Logitech. Logitech designs peripherals and devices to enhance computing and communication experiences. Improve productivity and collaboration with Logitech's user-friendly products.

Speaker 1:

Well, Mike, Craig mentioned how important it is not only to have that executive sponsor but then to quickly prove value on what you're doing with Copilot. What have we seen in terms of, I hate to say low-hanging fruit, but what are some easier or more obvious use cases that can demo that ROI or create momentum early on? Where should our listeners and the organizations out there be targeting?

Speaker 2:

The biggest impact I've actually seen is meeting recap within Teams. That alone is probably one of the biggest cost savings for me personally. The amount of time I used to spend hand-scribing notes in a meeting, so that I wasn't just typing away, and then having to digitize it later and type it all up, was very time-consuming. I don't have to do any of that anymore. Copilot completely takes care of it for me, and within five minutes after the meeting I can share the summarization and the meeting notes with everybody who was in the meeting. It saves me a ton of time per week. Email recaps and things like that: you come back from a two-week vacation, which a lot of people are doing now at the end of summer, and you've got 1,000 emails sitting in your inbox. You can easily end up spending your first two days back from vacation just going through emails. Copilot now summarizes most of that for me so I can figure out, okay, what can wait until later and what's most important that I need to address immediately now that I'm back. Those are probably the two bigger things I've noticed.

Speaker 2:

I was on with an engineer yesterday, and he was touting the capabilities within Excel, being able to ask Copilot how to do certain formulas. He said he never would have figured it out on his own; within seconds he's got the formula he needs. Coding: a lot of our coders are really getting into using it for coding support. But again, the Microsoft 365 AI assistant aspect of it, with the Teams integration, the emails and the rest of the Office suite that we use every day, Word docs and that sort of stuff, I think has a very big impact right out of the box.

Speaker 1:

Yeah. Craig, build on that. Mike mentioned the meeting recaps and the email summaries, and those are incredibly beneficial; I can attest firsthand, and I'm sure many out there can as well. What else is there? How should organizations be thinking about those use cases, which may be high-volume repetitive tasks, versus, maybe on the other end of the spectrum, more transformational use cases? Or are we suggesting that they tackle the repetitive tasks first and then build their way up toward that transformation?

Speaker 3:

I think it's chaining together those repetitive tasks into an improved workflow, and I'll give a specific example. Within my team, we deliver Copilot services for customers, and our project management leaders said, well, I guess we'd better be really good at using it ourselves. So, on purpose, they looked at everything they had to do as a project manager, which includes the statement of work and understanding it, creating the preparation deck to kick the project off, a lot of status reporting and keeping track of numbers, and then at the end we usually issue a case study or a win wire. They looked at the string of everything they had to do and designed how they could use Copilot in each of those areas. There are about 10 different pieces to that workflow, and they tracked the time: what did it used to take, and what does it take now?

Speaker 3:

There was an 83% reduction in the time spent on those repetitive tasks, and you might say, well, what did they do with the rest of the time? The fact is, we're kind of resource-constrained and they have a lot of projects, so they were able to focus on the customer. And I don't know whether it's a coincidence, but our customer engagement scores have been the highest they've ever been. The time they're spending is less on the copying and pasting, the repetitive tasks, and more on engaging directly with the customer and putting their energy there. Even the case studies: it used to be that I'd have to drag them out of people, because it's something they'd have to type up and everything else. Now, at the touch of a button, they can produce one, and so I'm actually seeing those published right away, which we then get to the sales team. So that's what I encourage organizations to think about: how do you take these amazing individual capabilities, from summarization to preparation of information, and string them into a work stream that then changes the way you operate within that group?

Speaker 1:

Well, Craig, you're talking about time saved across that string of events, but in conversations we've had with clients, sometimes it's hard to articulate the value of time saved. Are employees putting that time saved back into other workflows? How do you even understand what that value is in general? What have we found in terms of successfully articulating the value of, quote-unquote, time saved?

Speaker 3:

Yeah, that's what you'll see in articles, particularly from the vendors: the time saved. The fact is, you'd have to measure that on purpose, as a study, like a time-and-motion study, and it's not likely an organization is going to do that, which is why we emphasize picking an existing business metric. Time saved hopefully shows up in that business metric improving, and it could be customer satisfaction. So with my example of the project managers, the fact that they were able to save time on those soul-sucking, mundane tasks meant they were able to reinvest the time and energy in customer satisfaction, and we measure customer satisfaction on a regular basis. That's why, when we work with customers, we put some focus on that at the beginning and de-emphasize time saved.

Speaker 2:

Yeah, I'll agree with Craig on that. It really goes back to trying to figure out what your initial use cases are. When you're first going into this and looking at enabling Copilot, what are the top 10 things you're trying to accomplish? Maybe you narrow that down to three, where you're going in and using these tools to tackle those challenges. Like Craig was mentioning, you see immediate time savings with those, you get immediate wins, and then it builds from there. It starts to snowball and build momentum.

Speaker 1:

Well, certainly, based on a lot of the studies we've seen, scaling the adoption of Microsoft Copilot seems to be one of the things a lot of organizations out there struggle with. Are there any infrastructure or technology considerations there, whether it's data governance or cyber? What needs to happen from an IT infrastructure or just general IT perspective, Mike, so that an organization, once they're through with that pilot phase, can scale it appropriately within their organization?

Speaker 2:

Yeah, well, a great thing about Microsoft Copilot is that it's all part of the M365 ecosystem, which means you're keeping all the same security controls and everything you've got today, which is why it's so popular. It's very easy for organizations to implement because they're already using the Office suite that it integrates with, so it's a very easy add-on for them. Now, you definitely have all the security aspects that need to be taken care of during that pilot phase as well: making sure that people only have access to the data they're supposed to have access to.

Speaker 2:

That's the big scary part, I think, for a lot of our customers: once they open it up, they're afraid of what they're going to see.

Speaker 2:

So taking care of all those security controls and implementing all of that at the beginning is pretty key. But again, it's a pretty safe space; it's all within the Microsoft ecosystem, so it's fairly easy for them to enable, and we can help with that as well if they need it, working with their security teams or whatever. Once all that's done, it's all about your use cases and where you see the immediate wins, and we're seeing it everywhere, from people using the basic out-of-the-box functions of the AI assistants to more advanced customers who are looking at automated workflows and the extensibility side of it. Out of the box it's great, but what kind of custom development can we do to integrate it even more with our workflows and daily processes? We're starting to get a lot of customers asking about that. They've been using the AI assistants for a while and they're ready to take it to the next level.

Speaker 1:

Yeah, Mike, stick with that. What does that next level look like? I know it's probably different from client to client, but in general, what does that next level look like for organizations and what does it help them accomplish?

Speaker 2:

Again, it goes back to the specific use cases they're looking to enable. But for most customers, it's getting into more of the custom development side of Copilot, usually starting with Copilot Studio, where they can build their own Copilots and their own custom automation flows, those sorts of things.

Speaker 1:

Yeah. Craig, anything to add to that scaling playbook? We talked about identifying use cases, having a strong executive sponsor, internal champions, and metrics beyond just time saved. What else should be in the scaling playbook for organizations adopting Microsoft Copilot?

Speaker 3:

Yeah, if you have a good base of all that, I feel the next area is business process automation and agents. Mike mentioned Copilot Studio, which you can use to build agents, and I think this is a really untapped area where organizations haven't really started to reap the rewards. It's not too different from the business process automation that's been around for a long time, except now you have a much more powerful, more intelligent tool. So you'd still approach it the way you would have done business process automation in the past: document the business process, how it works and what the intersection points are. But then you can design in how an agent can do different parts of that workflow and string them together. I wasn't going to use the term agentic AI, but that's exactly the direction it goes, and for me it's the next evolution of BPA.

Speaker 1:

Yeah. Craig, what else are we seeing coming down the line? I guess, get out your crystal ball here a little bit: over the next 6, 12 months, maybe even further out, what type of evolution are we expecting out of Copilot, and where do we think organizations are going to benefit most?

Speaker 3:

That's a tough one to answer, because AI has been moving so fast and it's so surprising; what it's able to do even shocks me as a computer scientist and engineer. What I will say is, as with any technology, you can apply that technology to do what you do today better, and you're going to get some good incremental results in your business. What really makes a difference is doing something new that you've never done before because of that technology. So I'd say the exciting area is how businesses will change what they do because they have this technology, not just how they improve what they do today.

Speaker 2:

So for me, there's one next evolution that I'm starting to see.

Speaker 2:

I'm a collaboration person, so most of my focus is on voice and video technologies.

Speaker 2:

What I'm really excited to see now is Copilot starting to integrate itself not just into my desktop but into my conference space. Now, instead of having to have a computer or a device like that, I can walk into a conference room that's running a Microsoft Teams Rooms device and utilize Copilot within the room. If I want to have an in-person meeting that's not even a conference call, I can bring up Copilot on the video conferencing device. Now I have a scribe: where traditionally I would have somebody in the room taking notes, Copilot's doing that for me now. So as it starts to integrate its way into the collaboration tools, and with it being able to recognize who the speakers are, when you're getting the output you have actual names behind the comments, because the AI knows who's speaking based on their voice, if it's trained properly. Those aspects, I think, are pretty cool and groundbreaking from a collaboration standpoint. But again, that's more my focus.

Speaker 1:

Yeah, we're running short on time here, and I do want to be respectful of your busy calendars. But, Mike and Craig, I'll start with you, Mike: as we start to creep into the back half of 2025 and into 2026, what's one thing organizations should stop doing now, and one thing they should start doing right now, to help spur faster adoption of Copilot, if that's their chosen strategy?

Speaker 2:

Focus more on the individual work streams you're trying to accomplish. That's probably the biggest one. I think there's been a very big shotgun effect with Copilot; organizations are just kind of giving licenses out to the masses and seeing what sticks. I'm starting to see people take a step back from that and organize a little bit better. And again, getting back to what Craig's been talking about here: having the proper leadership backing it, having the right champions who are going to help drive it throughout the organization, and then focusing on those individual work streams and wins, one at a time. Before you know it, you've built a huge arsenal of things you can utilize Copilot for within your organization that's saving multiple groups a lot of time overall.

Speaker 1:

Yeah, Craig, same question: what should we be doing more of, or what should we start doing, and what are maybe some pitfalls that we should stop focusing on?

Speaker 3:

Sure. Maybe I'll pick a "continue," which is that organizations are working on their AI strategy, and that tends to be a lot of workshops, and it is important. What I highly recommend is to get started with a project now. If there aren't users, even on a limited basis, using AI day to day, you're not learning, and it's so important to learn from actual use, because that informs your strategy. If you haven't, on purpose, put a program in place, even with a small set of users, to change the way they work, that should be something to focus on.

Speaker 1:

Yeah, great. Well, to the two of you, thank you for taking time out of your busy days to share what I know is a very important topic for a lot of our clients and a lot of listeners out there, which is how to integrate and move faster with Microsoft Copilot. So thank you again for joining. Hope to have you on the show again soon. Thanks, Mike and Brian, it's been great.

Speaker 1:

Okay, as we wrap up, three lessons stand out from today's conversation. First, adoption is about people as much as technology. Tools don't transform organizations on their own; employees do. So executive sponsorship, champions and peer learning are essential to creating momentum. Second, start small, but start with purpose. The biggest wins come from identifying the right use cases, measuring outcomes against existing business metrics, and using those early successes to fuel confidence and scale. And third, Copilot isn't static. It will keep evolving, from summarization to workflow automation to intelligent agents. Organizations that treat adoption as a one-time project are bound to fail. Those that build adaptability into their culture will stay ahead. The bottom line: Copilot isn't just about saving a few minutes here or there. It's about rethinking how work gets done, aligning technology with people and strategy, and building a foundation for transformation that compounds over time.

Speaker 1:

If you liked this episode of the AI Proving Ground Podcast, please give us a rating or a review, and if you're not already subscribed, don't forget to subscribe on your favorite podcast platform. You can always catch additional episodes or content related to this episode on WWT.com. This episode was co-produced by Nas Baker, Cara Coon, Marissa Reed and Jeannie Van Berkham. Our audio and video engineer is John Knobloch. My name is Brian Felt. We'll see you next time.

Podcasts we love

Check out these other fine podcasts recommended by us, not an algorithm.

WWT Research & Insights (World Wide Technology)

WWT Partner Spotlight (World Wide Technology)

WWT Experts (World Wide Technology)

Meet the Chief (World Wide Technology)