Are You Ready to Capture Value from Predictive Service Today? Take the Quiz and Find Out!
The organizations that are successful with implementing AI at scale, or that are able to stack use cases and take advantage of compounding returns, put a strong emphasis on quality data. This is not the exclusive domain of data scientists and AI engineers, but a responsibility that everyone must share. Chris Joynt, AI & Analytics Translator, joins us again to give his perspective.
Welcome to Speaking of Service, the podcast that uncovers practical ways to grow service revenue, control costs, and improve customer satisfaction. If you're looking to innovate, gain a competitive edge, or just learn about the latest service trends, you've come to the right place. In today's episode, Chris McDonald, Head of AI and Analytics, sits down with Chris Joynt, AI and Analytics Translator, to discuss the advantages enjoyed by organizations that put a strong emphasis on quality data.

Welcome to the show. In this episode, we're going to talk about how to successfully scale AI as part of a smart, connected product strategy, and how to deliver insights that optimize your service strategy by thinking about data, data strategies, and in particular data quality. I'm very happy to welcome our guest, Chris Joynt, an AI and analytics translator. Chris, can you introduce yourself?

Thank you. Happy to be here. As you said, I'm an AI and analytics translator, and I've had the pleasure of talking to customers in many different industries about their smart connected product strategies, and about embedding AI in the digital experiences they are creating.

Thank you, Chris. To start off, I want to talk about data quality. What does data quality mean? What do customers understand it as, and how should service organizations think about data?

Good question. Data quality is one of those things that's not as binary as you might think. A lot of times we say "garbage in, garbage out," but data isn't necessarily garbage or not garbage. There are degrees of data quality. So when we say data quality, what I really mean is: how completely and how accurately does the data you have about an event, a process, an asset, or a customer reflect that real-world scenario? How completely have you captured it, how consistently have you captured as much of the information about it, and how much do you trust it? That's really what we mean by data quality. It's not "is this bit of data here erroneous or not?" It's not binary like that; there are degrees of data quality.

So some might say data quality is an IT thing or an IT responsibility. Can you explain why it matters to a service technician or an operator, and what role they have to play in ensuring data quality?

Good question. They're at the point of interaction, and they're on both sides of it. We are transforming these organizations to act on data, so we know that the service technician, the remote technician, the operator, whoever it is, needs to trust the data and needs to make decisions based on it. They have skin in the game, if you will. And if they're doing things right, they should be collaborating with the line of business, with data science, with everybody, to give their perspective when they don't trust something, or when they say, "I got this prediction, but you don't understand, I have to balance it against these other things here, so I'm not just going to take it and act on it because you said so." They have to voice that and bring it back to the table.

But also, by the nature of interacting with these applications and machinery, they are creating data. For example, and this is oversimplified, let's say an operator just hits the emergency stop. We've had customers whose operators have done this in the past, as opposed to slowing down the line and stopping it properly.
They just hit the emergency stop whenever they feel like it. Well, from a data standpoint, if I'm looking at that data set later, and imagine I'm not talking to this operator, I don't know how they're stopping the line. I look at that data set later and I say, well, these are two very different things, and they're going to lead me to two very different conclusions and two very different ways to act on that data. So it's really everybody's job, and they have to understand the importance of the data they are creating. And they have to communicate with each other about the data they're expected to make decisions on.

That's great, and there's data quality at a point in time, when data is created, inputted, or consumed, and that is certainly important. But how do you ensure that data quality isn't degrading over time?

Good question. There is a very well-documented phenomenon of data drift that happens across all industries; data drift is a real thing. There are a few aspects to this. If you look at any of these models for how to do AI, CRISP-DM, for example, is a classic approach, they're all circular. That means the process is continuous; it's continuing to improve. The thing you always need to be doing is going back and asking not only "are my models accurate?" but also "are they helping me solve this business problem?" So it's an ongoing thing, and there are a number of activities you can do to keep your data quality high as you go, beyond just retraining models. You can monitor the data itself to see if it has drifted at all.
And you can monitor features, if you will, to see if the features are drifting. You can also conduct ongoing error analysis. This is one of my favorite things, and I think everybody should be thinking and talking about it: once a month, or quarterly, or whatever makes sense for your business, sit down and review the output of your models. Where were they right? Where were they wrong? And is there commonality among the cases where they were wrong? Perhaps there are particular cases where the models are just doing really poorly. Perhaps my model doesn't really predict how my machinery is going to fare in the really dry, arid weather of the American Southwest, for example. Maybe it does well up here in Boston, but the model is failing in that region. So perhaps I need to get more data, or perhaps I need to create specific models for specific regions. There are a million ways to remediate it once you know that's the problem.

And to be clear, what you're describing can sound like a data science team activity, but really, reviewing models and their outputs is a collaboration; it's a business activity.

Absolutely. It's understanding how these outputs are being utilized, what outcomes they lead to, and whether they're correct or not. And it's an opportunity for someone like yourself, someone who stands between the technical and the business, between data science and business outcomes, to translate what the model is doing, how it's performing, and what it means. It's a great opportunity not just to ensure data quality, but to make sure everyone is speaking the same language, that we're constantly translating what the data and the analysis mean.

Absolutely. It's a team sport.

Thank you for pointing that out. Chris Joynt, thank you again for joining us. I appreciate it.
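The ongoing checks Chris describes, monitoring a feature for drift against its training baseline and breaking model errors down by segment to find where the model does poorly, can be sketched in a few lines of Python. The readings, regions, and data below are hypothetical illustrations, and production systems typically use more robust statistics (for example a population stability index or a Kolmogorov-Smirnov test):

```python
import statistics
from collections import defaultdict

def drift_score(baseline, current):
    """How far the mean of a feature in new data sits from the
    training baseline, in units of the baseline's standard deviation.
    A large score suggests the feature has drifted (simple heuristic)."""
    mu = statistics.mean(baseline)
    sigma = statistics.stdev(baseline)
    return abs(statistics.mean(current) - mu) / sigma

def error_rate_by_segment(records):
    """Group prediction errors by a segment key (e.g. region) to find
    where the model is doing poorly, as in the periodic error-analysis
    review described above. Each record: (segment, predicted, actual)."""
    totals = defaultdict(int)
    errors = defaultdict(int)
    for segment, predicted, actual in records:
        totals[segment] += 1
        if predicted != actual:
            errors[segment] += 1
    return {seg: errors[seg] / totals[seg] for seg in totals}

# Hypothetical sensor readings: the field data has shifted upward
# relative to what the model was trained on.
baseline = [1.0, 1.1, 0.9, 1.0, 1.05, 0.95]
current = [1.4, 1.5, 1.45, 1.6]
print(drift_score(baseline, current))

# Hypothetical failure predictions, tagged by region: the model
# looks fine in Boston but misses failures in the Southwest.
records = [
    ("Boston", "fail", "fail"),
    ("Boston", "ok", "ok"),
    ("Southwest", "ok", "fail"),
    ("Southwest", "ok", "fail"),
    ("Southwest", "fail", "fail"),
]
print(error_rate_by_segment(records))
```

Reviewing numbers like these together, rather than leaving them to the data science team, is exactly the kind of cross-functional check the conversation argues for.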
Thanks for listening to the Speaking of Service podcast, brought to you by PTC. If you enjoyed this episode, please subscribe wherever you get your podcasts and leave a rating or review. And be sure to check out other episodes to hear new perspectives on improving life for aftermarket professionals, service teams, and the customers they support. If you have a topic of interest or want to provide feedback, email us at speakingofservice@ptc.com or visit us at ptc.com/speakingofservice.