Auditing with data: for Performance Auditors and Internal Auditors that use (or want to use) data

42. Michael DeCero, a Data Professional at TDS with an Internal Audit background

Risk Insights - Conor McGarrity and Yusuf Moolla Season 1 Episode 42

Michael DeCero is an Internal Audit Analytics Manager at TDS, a telecommunications company. 

In this episode, Michael explains how he helps his audit team use data. 

About this podcast
The podcast for performance auditors and internal auditors that use (or want to use) data.
Hosted by Conor McGarrity and Yusuf Moolla.
Produced by Risk Insights (riskinsights.com.au).

Narrator:

You're listening to The Assurance Show. The podcast for performance auditors and internal auditors that focuses on data and risk. Your hosts are Conor McGarrity and Yusuf Moolla.

Yusuf:

Today we have Michael DeCero from TDS. Michael is an Internal Audit Analytics Manager at TDS where he's been for over 12 years now, I think it is. Is that right?

Michael:

Yeah. I've been with internal audit for 12 years and specifically in the data analytics space in internal audit for the past three years.

Yusuf:

What brought you to audit, and how did you get to where you are right now?

Michael:

TDS stands for Telephone and Data Systems. We're a telecommunications company based in the Midwest of the United States, but we've got markets across the country: wireline and wireless service for all your telecommunications-related needs, plus cable companies and hosted managed services. What drew me to internal audit, honestly, was that I graduated college with my undergrad in Accounting back in 2009. That was right during the recession, and it was really challenging to get a job. I figured I had to do whatever I could to get one. I was just telling this story to an intern in our group right now: I was applying to everything. I just wanted to make sure I was making decent money somewhere. And if you recall, this was not long after Sarbanes-Oxley became a thing in 2002, so accountants and auditors were in really high demand. That's what drew me to it. The reason I've been in internal audit for 12 years is that I've gotten really lucky working with TDS. Not only is it rewarding work at a telecommunications company, but they really take care of me too. So that's the biggest reason I've stuck around in this field for so long.

Yusuf:

You've been in the data area for the last three years or so. What was it about data that excited you or drew you into that?

Michael:

That's a good question. Early in my career, I was trying to identify what life after internal audit looks like. If I wanted to continue to move up the corporate ladder, especially in internal audit, they expect you to go into the business and get some experience there. If you ever want to become a director, a VP, or a Chief Audit Executive, they expect you to have some business experience. I was working with my leaders on what that next step looks like for me, and the word that kept coming back was accounting. And I said, I'm not interested; I did that in my undergrad. I started reading about current trends, whether in The Economist or other magazines and published articles, plus a lot of good stuff I follow on LinkedIn. And I started to get a sense that automation and artificial intelligence are a threat to the workforce in finance, especially accounting. So I wanted to make sure that I stayed ahead of that curve. I said, I need to differentiate myself a little bit. I want to get a sense for what data analytics really means. I don't want to just say it as a buzzword, because I often think that's how it's used: people don't really understand what it means or what it entails. So my company, my department, set up a new data analytics function within internal audit, and I stepped into that role.

Yusuf:

Okay. And what's your experience been over the last three years? How has your view of the use of data changed or your experience with data changed over that time?

Michael:

It's a great question, how it's changed over that time. I'd say some things have stayed the same and some things have changed dramatically. What's stayed the same over the past three years: it's just as challenging to get our hands on good, reliable data as it was three years ago. Identifying where the connections are that we need to get data, and ensuring that we're getting what everyone in this field calls, as a bit of a buzzword, the single source of truth. How do we agree with our stakeholders on the proper set of data that we all accept as complete, accurate, and representative of what we're looking for? And then agree on what we should be seeing: what are the thresholds where something is good or bad? What is the threshold where something's considered an exception or not? That is all extremely challenging, continues to be, and probably will be for a while. What's changed is that all companies seem to be trying to get more centralized and formalized with their data needs. Meaning, how do we store the data in a coherent, cohesive manner? How do we govern said data so that we understand things like "What is the single source of truth?" And we're making progress. So although it's still a challenge, companies are starting to realize that if they really want to get true value out of their data, they need to have good processes in place, good warehousing, and good maintenance to ensure that they can rely on said data and that it has the right availability for people. And of course, we've got to make sure we have it all secured, too.

Conor:

You spoke there about some of the challenges that were data focused in the three years that you were trying to set up this function within internal audit. Can you tell us a little bit about some of the people challenges or some of the changes that needed to be made maybe around the people working in internal audit, but also within the business when you were trying to set this up?

Michael:

Great, I'll talk first about the people. When I first took this role three years ago, I was just an individual contributor, the sole data analytics guy on our team. But as needs grew, we needed to grow the team to fulfill them. So I hired a few individuals. Now we've got a team of about, I'd say, two and a half: a couple of full-timers and an intern. We've had to be creative and strategic in our headcount and resourcing needs, because resources and headcount don't grow on trees. Luckily, I've worked with my superiors and had a lot of support to convert some of the operational and financial audit positions that have been vacated over the past three years, not all of them, to our data analytics function. As far as process challenges go, and this does include people: I'd say the vast majority of individuals out in the business world, especially in finance, are a little fearful of anything data, really. I don't necessarily think it's just that they're concerned about losing their jobs to automation or artificial intelligence. I think it's more the heavy mountain of knowledge and skills they assume you need in order to really do this type of work, and I think that might be a little misguided. Don't let fear stop you. The big hurdle has been trying to convince people: hey, give us the time and space, and we can really do some great things with the data in our hands. Instead of selecting a sample of 30-odd items to test and report on out of a population of hundreds of thousands, if not millions, maybe we can in fact test the entire population and get much better insights, where our stakeholders will appreciate, respect, and act upon our findings better.

Conor:

So you mentioned there that in setting up your team, you were able to convert a few of the positions that were vacated. It'd be really interesting to understand what was your compelling argument to be able to get those positions enabled within your team?

Michael:

Yeah. Like I said, I got great support from up top, but I had to sell what we were going to use these resources for. And the big thing is, although we didn't have hard numbers in our business case, if you will, we got close to an estimate of how many hours we expect to save with some automation features we're building out. Take Sarbanes-Oxley here in the States, for example: our internal audit team spends a third of our audit plan every year on compliance activities specifically for SOX. That's a repeatable process, done every month, quarter, and year. So we've started to build some workflows that will save the audit team a lot of time. Instead of an auditor coming in to do this every quarter, maybe we can build a workflow that will do it for us, or at least up to 50% of the work. And we showed that in order to really realize those gains, we need another resource, because we have so many other requests coming in for ad hoc reports or analytics for the various audits in process. We broke it down: this is how much time we're spending on audit project requests, which is the vast majority of our time, I'd say 80% right now. We're only dedicating about 10 to 20% to SOX automation activities. If we can get another headcount here, we can really start realizing this automation strategy. That's been the selling point.

Yusuf:

With the team that you're on and the use of data across audits: is it that everybody comes to the central team to enable that part of their audits, or do some of the teams actually use data directly themselves?

Michael:

It's a combination of both, but I'd say the majority is the former. If I had to estimate, probably 85% of the time they're coming to us with a specific request: we need you to run some analytics, build some reports for us, dashboards, et cetera, or build a workflow to get to some sort of outcome. That's the majority of the time. One thing we do that I'm really proud of: we use Tableau a lot for our dashboarding, and that gives us an opportunity to build a dashboard that our operational and financial audit teams can use. Once we coordinate with them and say, here's the data we have, here are the outcomes we're expecting, here are the visuals we need to help identify exceptions, or something that looks strange that we want to research further, or maybe to help us select our risk-based sample, we're able to build Tableau dashboards that they can filter and search on their own to see if there is in fact some, you know, what I call monkey business going on in the data. That's been a real success, because in any data analytics shop, I think you want to promote data literacy in your department, whether that's internal audit or anywhere, and this has really driven a lot of that. I think we, as the data analytics function, have done a really good job boosting that within our department. We've also hosted some SQL trainings with select operational and financial auditors, just to give them a bit more understanding and ideas of other skills and tools out there to get them what they need.
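To make the idea of full-population exception testing concrete, here is a minimal sketch of the kind of rule-based flagging pass that could feed a dashboard like the ones described. The record fields, the approval limit, and the three rules are invented for illustration; they are not TDS's actual schema or tests.

```python
# Hypothetical sketch: flag "monkey business" across a full population of
# expense records before loading them into a dashboard for filtering.
from collections import Counter

APPROVAL_LIMIT = 5000  # assumed per-transaction approval threshold

def flag_exceptions(records):
    """Return each record with a list of exception reasons (empty = clean)."""
    invoice_counts = Counter(r["invoice_id"] for r in records)
    flagged = []
    for r in records:
        reasons = []
        if r["amount"] > APPROVAL_LIMIT:
            reasons.append("over approval limit")
        if invoice_counts[r["invoice_id"]] > 1:
            reasons.append("duplicate invoice id")
        if r["approver"] == r["submitter"]:
            reasons.append("self-approved")
        flagged.append({**r, "exceptions": reasons})
    return flagged

records = [
    {"invoice_id": "A1", "amount": 1200, "approver": "pat", "submitter": "lee"},
    {"invoice_id": "A1", "amount": 1200, "approver": "pat", "submitter": "lee"},
    {"invoice_id": "B7", "amount": 9100, "approver": "kim", "submitter": "kim"},
]
results = flag_exceptions(records)
for r in results:
    print(r["invoice_id"], r["exceptions"])
```

Because every record is tested rather than a sample of 30, the dashboard can let auditors filter straight to the flagged rows and drill into why each one was flagged.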

Narrator:

The Assurance Show is produced by Risk Insights. We work with performance auditors and internal auditors, delivering audits, helping audit teams use data and coaching auditors to improve their data skills. You can find out more about our work at datainaudit.com. Now back to the conversation.

Yusuf:

What do you find to be the key challenge in working with other audit teams on scoping and planning their audits?

Michael:

It's a great question. Our team is specifically involved in what a data analytics team can and should give you. And what I have found to be probably the biggest challenge is agreeing on a clear enough expected outcome for our team to deliver on. In program development terms, it's requirements: how do we land on and agree upon the requirements that my team is delivering on for any given audit project? I think that's just the nature of this. People who are not involved in data day-to-day have a hard time articulating exactly what they want. So it's always going to be a conversation that needs to be had, and indeed it's probably multiple conversations. My big push has been that I am not comfortable agreeing to any timelines or deliverables until we have a documented understanding of what those requirements are for any audit project. And it has been going well. We have some common questions that govern this type of activity. What are you expecting? Do you want an Excel output or a Tableau dashboard? What is the definition of an exception? Another really important one is: what are the key attributes of the data set? When we're getting a table of, whatever, a million records or rows, and I've seen up to 50, a hundred, 200 columns, and we're seeing column X as the one we really want to base our analysis on, we need to agree with the business that we understand it correctly. I've been working on my master's in data science, more specifically in artificial intelligence now, and whenever you're doing data science work, you're going to spend probably 70 to 80% of your time understanding your data. That's vital in any role. I don't care if it's internal audit or advanced data science: you need to spend that time to understand your data and know, okay, we are going to base this algorithm, this program, this model on column X. We need to know what column X, or whatever column it is, actually means, and we need to agree with our business partners that our understanding is correct. Until you do that, you really can't move on to your testing.

Yusuf:

When internal audit teams come to you with an audit that needs to be done, how much effort do you have to put into explaining what is possible before you can get into exactly what is then required?

Michael:

Great question. It depends; I'll give two examples. For something straightforward, like the procurement project we're doing right now, or the expense reimbursement project we did earlier this year, you can come to some pretty clearly defined expectations. We want a dashboard of spend by vendor. We want a dashboard of spend by purchase order approver. That's easy. We understand what they're trying to do: get a better summarized view of the data so they can understand it, make sample selections, drive questions, et cetera. But those are pretty simple. A recent example we came across just today: my team ran a sentiment analysis on some survey responses we sent out. And I think this is a great practice that we're going to try to implement for as many audit projects as we can. As you're probably familiar, you have your planning meetings and fieldwork, and you're having discussions with various stakeholders and process owners to understand their process and figure out what the risks are. Well, for the procurement project we're running, we actually sent a survey out to, I believe, over 200 people involved in the procurement process in the business. We had a series of both closed-ended and open-ended questions: multiple choice, and then free-text questions. The closed-ended questions are easy to report on. How many people selected A, B, C, or D? You can report on that and get a sense for how things are looking. One of the questions we asked was: did you participate in the training that was given over the past X amount of years? Because the company said that indeed they gave the training to everybody, and now we're able to report whether people actually took it. Pretty straightforward, quite easy.
What we're learning now is from the sentiment analysis we did on the open-ended, free-text questions. If you've ever taken surveys like that, you get responses, and there's usually an algorithm that determines whether each one is positive, negative, or neutral. When we started looking through our results, we saw that we had a lot of errors, I'll just say. I don't know if you want to call them false positives or what have you, but we were saying something was positive when really it was not even neutral, maybe negative. Our model didn't predict it correctly. So now we're saying, what we really need to do is take a data science approach: select a sample of that population and train on it first. Run it through the model and see what the results are. Are they right? Are they wrong? For the ones that are wrong, mark them: hey model, you got this wrong, try again. This is where machine learning gets involved, training your model on said data and then testing it to see: is it viable? Is it giving you the results you're expecting? Then, and only then, once you understand that, can you move on to feeding it other data. And that takes time. We did a real quick turnaround, and now we need to take a step back and say, all right, if we do this in the future, we need to take much more time. We turned this around in a day or two; we need a week or two to train the model, test it, and get to a more accurate result.
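The train-on-a-labeled-sample-then-test loop described above can be sketched in a few lines. This is purely illustrative: a tiny word-frequency classifier stands in for a real sentiment model, and the survey responses and labels are invented for the example.

```python
# Illustrative train/test loop: label a sample by hand, fit a trivial
# word-frequency classifier, then measure error on a held-out set.
from collections import Counter

def train(samples):
    """Build per-label word counts from (text, label) pairs."""
    counts = {"positive": Counter(), "negative": Counter()}
    for text, label in samples:
        counts[label].update(text.lower().split())
    return counts

def predict(counts, text):
    """Pick the label whose training vocabulary overlaps the response more."""
    words = text.lower().split()
    scores = {label: sum(c[w] for w in words) for label, c in counts.items()}
    return max(scores, key=scores.get)

labeled_sample = [  # hand-labeled training sample
    ("the process is clear and the training was helpful", "positive"),
    ("approvals are easy and fast", "positive"),
    ("the system is slow and confusing", "negative"),
    ("nobody explained the policy and the tool is broken", "negative"),
]
holdout = [  # held back to estimate the error rate
    ("training was clear and helpful", "positive"),
    ("the tool is slow and broken", "negative"),
]

model = train(labeled_sample)
correct = sum(predict(model, t) == y for t, y in holdout)
print(f"holdout accuracy: {correct}/{len(holdout)}")
```

The point of the holdout split is exactly what's described in the episode: you only trust the model's labels once you've measured how often it gets a hand-checked sample right, and mislabeled cases go back into the training set for the next iteration.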

Yusuf:

That's something that we're seeing more of in a number of areas. And particularly where you have false positives that you want to eliminate, but also other areas. And it can really be very useful. But like you say, it takes a lot of time to make sure that you're selecting the right features and selecting the right algorithm to do the learning and the prediction, et cetera.

Michael:

Yeah, this is a really interesting topic, because I've been thinking about it: as auditors, we want black and white. Is this an exception, yes or no? And we want a hundred percent confidence in that answer. Well, we may want to have some language in our audit report that says we did this and we have, say, 90% confidence in these numbers. You can run statistical scores on your models and get what's called the adjusted R-squared to see how well the model is performing, what the error rate is, those types of things. We're not there yet, but we're considering how to articulate this in an audit sense, where we're not stating an absolute but we still fulfill our audit obligations to say what our conclusions are. It's kind of a hard juggling act.
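One concrete way to back a "90% confidence" statement like the one mentioned above is a standard confidence interval around a rate measured on a manually validated sample. This is a sketch of one possible approach, not what TDS reports; the sample counts are invented, and it uses the normal approximation for a proportion.

```python
# Normal-approximation confidence interval for a measured proportion,
# e.g. a model's accuracy on a hand-checked validation sample.
import math

def confidence_interval(successes, n, z=1.645):
    """Two-sided interval for a proportion; z=1.645 gives ~90% confidence."""
    p = successes / n
    margin = z * math.sqrt(p * (1 - p) / n)
    return max(0.0, p - margin), min(1.0, p + margin)

# Say auditors hand-checked 200 model labels and 170 were correct.
low, high = confidence_interval(170, 200)
print(f"measured accuracy 85.0%, 90% CI: {low:.1%} to {high:.1%}")
```

A report line can then hedge honestly: "the model agreed with manual review in 85% of a validated sample; we are 90% confident the true rate lies in the stated interval."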

Yusuf:

What would you say the next step in that particular area would be for you? Going from where that outcome was to where you'd like it to be, to be able to properly produce a result and explain it.

Michael:

Our current intern wrote this sentiment analysis model, and if you asked her, she's probably frustrated, because really she wanted to spend more time getting to a more accurate, dependable model. At the same time, we had to balance the needs of our audit project team to get them what they needed in the timeline they had. But to me it's like: no, this is good. Maybe we didn't get exactly what we wanted for this particular project, but now we have a model we can use for future projects. I just asked her today: hey, spend some time over the next month actually training up the model and testing it, so that we have that in our back pockets. What I've been finding in this work is that scale starts to become exponential. We run this program for one project, learn from it, and apply it to another, so if we do another survey for a project, we can easily run the responses through that same program, and we at least know what our confidence rating is. We just keep learning from there, keep adjusting. The more surveys we send through it, the smarter the model will be. That way we're actually making progress and getting more reliable results. It's going to take time, and it's going to take iterations to get there.

Conor:

One of the important things there, in that approach, is having a real experimental mindset, or at least some acknowledgement that you won't get things right every time, or the first time, and that it might take a little while to actually start delivering results. Can you talk to us a little bit about how you went about getting the leadership buy-in to allow that experimentation to happen?

Michael:

I think, again, I got lucky. We have pretty good tone at the top, you know, my VP and Director. The tone at the top has been: data analytics needs to be deployed for every audit project. That's really been the message from the top, and that's paved the way for us. Anything in audit, anything in the business world, you need that tone at the top. How did that get sold to their superiors? I think people are open to new ideas. I've seen this change happen slowly, yes, but people do want to explore the current trends in the marketplace. That's our culture here, and that's probably the most important factor in our success so far.

Yusuf:

You spoke about key challenges. What do you see as the key opportunities for you and your team and the broader internal audit team over the next few years in using data?

Michael:

Oh man, there's a lot of opportunities here. What's really exciting to me is that everyone I've talked to, across many channels, many industries, many functions, many companies, seems to be on this wave of trying to get a better handle on all of their data sources and trying to streamline and warehouse that data in a cohesive manner. That's where I think everyone has opportunity, and namely our data analytics function within our department. Right now we're usually getting flat files from our audit teams to do a lot of this analysis, and that's probably fine. If we were connecting directly to databases or data warehouses to run automated workflows, you run the risk that you might miss something with your maintenance, and something could go wrong and give you an inaccurate result. Whereas if you have a control in place that says, deliver said report to me on a cadence, and we all agree that this is the report, that single-source-of-truth type thing, then we can rely on our change management controls: if this report does change, we know about it, so we can adjust our workflows accordingly. That's actually a pretty sound process where we can maintain our workflows and programs pretty well. Now, when it comes to how that applies to our entire department, we need a better sense for that inventory of systems the enterprise uses.

Conor:

You're obviously very passionate and feel strongly about using data, not just in internal audit, but going forward in your own professional life. Is there a community of practice or a group of like-minded individuals that you tap into to learn from, or share experiences or anything like that?

Michael:

It's interesting you ask that, because somebody just reached out to me on LinkedIn last week about a program that sets up random video discussions with other data-interested people around the world. I keep in touch with a lot of people I've known in previous roles, specifically internal audit and operational and financial roles, who have moved on to other organizations or other departments. And I keep in touch with them because, honestly, everyone I've talked to, whether they're in IT, finance, or marketing, is starting to learn how to use data, because they're like, oh, we can actually get a lot of value out of this. So I find a lot of value in just keeping up with people I've worked with in the past: hey, what are you doing with data these days? A lot of it is also supplemented by my master's program. I'm taking discrete mathematics right now, and as a class we're having debates about things as abstract as "Was mathematics invented or discovered?" I like that kind of stuff. There are people out there who are really interested in this stuff, and I consider myself to be one.

Conor:

What does the next five years hold for you?

Michael:

Over the next five years, I do plan to get some sort of role within the business, where, again, I think that if I ever want to come back to audit and move up the chain, I need to get that experience in the business. That next step for me is maybe some sort of data engineering role, to get more hands-on experience managing these databases to get said insights. How do you balance a structured database for structured data against unstructured data? How do you balance that with what it would cost? How do you get all these different systems to talk to each other? That's a really interesting challenge to me, and I think it's a question everyone's asking right now, and I want to get in on that work. In five years I'll also be done with my artificial intelligence degree. And then who knows? Who knows what five years from now is going to look like, because things seem to change pretty rapidly these days.

Conor:

So as a data-focused internal auditor, what are some of the key things you've learned or developed in that realm that are going to hold you in good stead in the future?

Michael:

What internal audit has really taught me, specifically, is how to bring together the people who know the business and the people who know the data, and get them to agree on something. That has been a huge challenge, and I wonder why that is. I think it's because the people who are really good with the data don't always think about the business side. They're so passionate about it; they so much enjoy just looking at a data set and saying, oh, I could do it this way, or I could do it that way. They often don't think about what's really going to drive the bottom line. They're not thinking about the investment in dollars; they're just thinking about what seems cool. I've learned to take that step back and start with the customer need first. That's something from an old Steve Jobs interview. He said, when they were first starting up, they were like, oh, we can offer this really cool product because it does X, and they struggled at first because they weren't listening to their customers. So start with that and work backwards. Start with what the customers want, and then build the technology according to that.

Conor:

That's good advice.

Yusuf:

If you had to choose one project that you've been involved in that used data within audit and that produced a good result, what would you say the key success factor for that would have been?

Michael:

Glad you asked. I think it's what I just said: understanding your customer's needs and aligning on what those expectations are. That's been the biggest success factor so far. I mentioned my passion for data governance in our pre-call, and we're running a data governance project right now. So, okay, what are all the tables out in our databases that we really care about, the ones that have sensitive data: customer private information, financially relevant information? How do we agree with our stakeholders on the tables we indeed want to look at? Then things like: is the records retention policy being followed? Are there records out in our databases that are 10, 20, 30 years old? How many records are that old? How many are out of compliance with some of our other policies? Getting all of that defined, all those requirements defined and agreed upon, is vital.
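A retention test like the one described boils down to a simple age query once the in-scope tables are agreed. Here is a hedged sketch: the table name, column names, cutoff date, and the in-memory SQLite database standing in for the real warehouse are all invented for illustration.

```python
# Hypothetical records-retention check: count records created before an
# agreed retention cutoff. Schema and data are made up for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer_records (id INTEGER, created TEXT)")
conn.executemany(
    "INSERT INTO customer_records VALUES (?, ?)",
    [(1, "1995-03-02"), (2, "2012-07-19"), (3, "2021-01-05")],
)

# Assume a 10-year retention policy measured from an assumed "today";
# ISO-8601 date strings compare correctly as text.
cutoff = "2013-01-01"
stale = conn.execute(
    "SELECT COUNT(*) FROM customer_records WHERE created < ?", (cutoff,)
).fetchone()[0]
print(f"{stale} records exceed the 10-year retention window")
```

The same query, run per table against the agreed inventory of sensitive tables, gives the "how many records are that old" numbers the audit needs.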

Yusuf:

What's the best way to find you and connect with you, Michael.

Michael:

Find me on LinkedIn. That's probably the best spot, and I'm pretty active on there, usually every day, because that's where I've found, as you asked before, those like-minded individuals. I follow a lot of data science channels on there. There's one in particular, for any nerds out there like me, called Towards Data Science. A lot of technology channels are very high level and can be a little trendy and buzzwordy, and I don't get a lot of value out of those. So I really try to find the technical ones that talk about the approach and process companies are using for whatever type of analysis, and that's one I get a lot of value from. There are a lot of others out there too, but for me personally, just find me on LinkedIn.

Yusuf:

We'll put that up in the show notes. Michael, thank you for joining us today. Lots of good insights there.

Michael:

My honor, really appreciate talking to you guys today.

Narrator:

If you enjoyed this podcast, please share with a friend and rate us in your podcast app. For immediate notification of new episodes, you can subscribe at assuranceshow.com. The link is in the show notes.
