Auditing with data: for Performance Auditors and Internal Auditors that use (or want to use) data

36. Why the audit objective should drive our use of data

Risk Insights - Conor McGarrity and Yusuf Moolla - Season 1, Episode 36

In this episode, we discuss why it is important to focus on our audit objective when using data analysis in our audit.

We explore:

  1. Why a focus on the objective helps us measure success
  2. Why we should avoid rules-based analysis
  3. How a focus on the objective helps us stay true to our audit mandate
  4. Cost-benefit analysis when we find things that are not related to the audit
  5. Why a focus on the objective helps us in reporting on data quality or missing data

This is explored in more depth in our book, here: https://riskinsights.com.au/the-data-confident-internal-auditor


About this podcast
The podcast for performance auditors and internal auditors that use (or want to use) data.
Hosted by Conor McGarrity and Yusuf Moolla.
Produced by Risk Insights (riskinsights.com.au).

Narrator:

You're listening to The Assurance Show. The podcast for performance auditors and internal auditors that focuses on data and risk. Your hosts are Conor McGarrity and Yusuf Moolla.

Conor:

Today's conversation is about the importance of focusing on our objective for any data analysis we're doing as part of our audits.

Yusuf:

We've got five areas that we'll focus on today, all around why we should think about the objective upfront, and develop our hypotheses upfront, when we're looking to use data on any audit.

Conor:

How will we know when we've actually been successful using data in our performance audits or internal audits? What does that success look like?

Yusuf:

So Alli Torban, the data visualization designer who was on the show, recently put out a podcast episode about how to measure the success of your dataviz. The first thing she talked about was defining success based on the objective that's been identified: what is it that you're looking to achieve with your data visualization? You then define your success along those lines. The same applies to us. What is it that we're looking to achieve overall? How does that link to the audit work that we're doing, and what are the hypotheses that we're trying to either prove or disprove? If you don't have that, how are you going to measure success? If you don't know what your aim is, what your target is, why it is that you're doing what you're doing, how are you going to determine at the end whether you've been successful? And sometimes we fail. Failure is fine; we learn from failure. In fact, if we're not experimenting and failing at least a few times a year with our audits, then we're probably not very successful, because part of the success metrics for using new and emerging technologies should be that there will be a level of failure. If you're not failing, you're either spending too much money or you're not being adventurous enough in how you do what you do. But what it is that we're looking to achieve, what the objective is, and how that links to the audit mandate, that needs to be quite clear. We then define success based on our ability to meet that objective. And at the very end, we define success based on our audience's ability to understand what we're saying.

Conor:

And so there's a real hierarchy here. As auditors, we need to understand the objective, the mission, the strategy of our organization. How do our audits flow from that, or contribute towards that overall organizational objective? And then, within our audits, how do we make sure we've got a proper objective for how we're going to do our data analysis?

Yusuf:

We are helping to enable that strategy to be achieved, or helping to prevent it from not being achieved, if that makes sense. Either way, we're always looking to both enhance and protect our organizations. As auditors, that's what we do within internal audit. Within performance audit, we ensure that government is achieving efficiency, economy, effectiveness, and compliance. The link between the overall strategy, the individual audits we're doing, and the data work we're doing gets broken if we don't take an objective-based, hypothesis-led approach to the way in which we use data on our audits.

Conor:

Okay, so that's number one. To ensure the success of our analysis, we need to have the objective clear from the outset. What's the second thing we need to be conscious of to make sure we've got a proper focus on the objective?

Yusuf:

There's one thing we see quite often; we've spoken about it many times before, but it's worth mentioning again because we continue to hear people asking for it. Just the other day it happened again. There's always this: what rules are you going to run? What tests are you going to use? What compendium of test listings do you have?

Conor:

Tell us your libraries. What libraries are you using?

Yusuf:

What libraries are you using, that's right. So look, the challenge with that is, whether you've been using data on your audits for a while or you're just starting out, you will find, as we've all found, that you get to the end of a rules-based analysis exercise and then ask yourself: okay, we found this, but what does it mean? How important is it? What's the level of risk we're facing here? Is there any risk at all? Do we need to discard these results? Do we want to do any more with this? And that's because we haven't actually started with what it is that we're trying to achieve. We're starting with what we've seen in a forum, or on some software supplier's website, or something somebody handed us in an Excel spreadsheet. Some of these things actually come from financial audit risks, or a lot of them do, and in internal audit and performance audit we're generally not looking at financial audit type risks. So what happens is that we get to the end and we're either discarding results, or trying to explain why they're important, or sending people down rabbit holes trying to fix things that aren't really that important. And it's because we haven't started with the overall objective for the audit, which, like you said, is linked to the overall strategy of the organization. Why do we want to put ourselves in that position? Why do we want to look like stupid auditors who get to the end and say, I'm sorry, I don't really know why this is important, but I had to do it because there was a test in the test library? Let's just stop doing that. I'm sure you can hear that it's quite frustrating to me. But it's one of those things: I never want to hear somebody ask again, what's the library, what's the catalogue, because it just doesn't make sense. It doesn't help the way people view you as an auditor, and it doesn't help your own thinking around getting to the root cause of challenges or situations that you can help fix for the organization.

Conor:

If you're about to start an audit and you're thinking about employing some libraries you've come across, ask yourself: why am I using this? Is there sound logic to using this test that already exists? How does it contribute to my objective? And if you can't come up with a decent rationale, then steer clear of it. That's number two. So what's number three we need to be thinking about?

Yusuf:

Really simple: are we meeting our audit mandate? Again, this links to the previous two. Our audit mandate is something we've defined along with the audit committee. The problem here is that we all have limited resources; there's limited time available within teams. If we're not laser focused on the issues we need to deal with to achieve our overall mandate, and then the plan that flows from that mandate, we're taking away from the work we really should be doing. If we each have 2,000 hours in a year and we're a team of 10, we have 20,000 hours to execute overall. Any time we spend running tests from libraries, et cetera, takes away from our ability to look at the key items we need to look at to meet that objective. So again, breaking that link means we're not able to achieve our mandate, and that's a big problem.
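
As a rough sketch of that opportunity-cost arithmetic: the team size and hours below come from the example in the conversation, while the hours lost to untargeted library tests are an assumed figure, purely for illustration.

```python
# Capacity arithmetic from the example: a team of 10, each with
# 2,000 audit hours per year.
team_size = 10
hours_per_auditor = 2_000
total_hours = team_size * hours_per_auditor  # 20,000 hours to execute overall

# Hypothetical time spent running untargeted library tests; every one
# of these hours comes straight out of mandate-linked work.
library_test_hours = 1_500
mandate_hours = total_hours - library_test_hours

print(f"{mandate_hours:,} of {total_hours:,} hours remain for mandate-linked work")
print(f"{library_test_hours / total_hours:.1%} of capacity lost to unfocused testing")
```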

Conor:

I might draw a little bit of a distinction here between performance audit and internal audit. The performance audit mandate is probably a little trickier, in the sense that there may be more wriggle room for performance auditors, as part of their audit planning, to do a little bit of data discovery and try to understand where the data sources are. So there is more capacity to spend some resources and hours on understanding that. That being said, it doesn't take away from the fact that you still need to be fairly settled on your objective for your data analysis, and make sure it's clear, before you start looking at the data in detail.

Yusuf:

Maybe let's unpack that a little bit, 'cause that's important, right? This doesn't take away from the need to understand the data sources, where they exist, and what they look like. However, you're not automatically going to look at every dataset that exists within the public sector, within your state or your region, when you're doing that. Usually you would have chosen a topic, or have some sort of topic you're looking to evaluate, and there would be an angle for that topic: efficiency, effectiveness, compliance, or economy, or maybe a combination of those. So when you're looking for that data, you have that in mind. The objective might be to determine the effectiveness of the way in which principals manage performance within schools, or the efficiency with which certain services are delivered within the public sector. Knowing what you're angling towards then helps you determine where the data actually exists. And then you use that to determine whether the hypothesis you're going to come up with can be answered through the use of data or not. Is that fair to say?

Conor:

Yeah, absolutely.

Yusuf:

Using an objective-focused and hypothesis-focused approach is primarily about what analysis you're going to do once you understand and have the data you're going to use. So: what is possible? You take what is possible, link it up with the hypothesis you're trying to answer, prove, or disprove, and then work out, what is it that I'm actually going to do now? It's at that point that you need to have understood what the hypotheses actually are. Exploring the data, looking at an initial view, understanding where all the data is, that still has to be done. You can't really get away from that unless you have a very clear, very small mandate and you know exactly what you're going to be looking at. That discovery will vary; in internal audit land there'll probably be a little less of that discovery across different datasets. You still want to do it, you definitely want to do it, because you want to know where everything is and what can be answered. But what we're talking about here is, once you understand all of those items, how do you then determine what analysis to do? That's when the difference kicks in.

Conor:

And the thing to resist, when you're going through the data on an initial sweep and you find things that may be interesting to you, is to chase them. You have to bring yourself back to: is this within my mandate? Is this within the objective of the audit? And if you can't reasonably answer those questions, then maybe the interesting thing needs to be set aside for another time.

Yusuf:

In that situation, you've probably got three ways you can go. The first: yes, this is interesting to me, it aligns with the objective and the hypothesis we have, so I'm going to go and do something with it. The second: it doesn't align with the immediate objective, but it does align with my audit mandate more broadly, so I'll set it aside for a different audit, which may or may not be on the plan. You may not have an audit in the plan, but you know there's this thing that needs answering; depending on how critical or important or risky it may be, you may look at it straight away, or try to figure out whether there's something you need to look at right now, or set it aside for later. And the last: this isn't important to me at all. It doesn't fall within anything that I do, or need to do, or need to explain, and it isn't important for the organization I'm working with or for, so I'm going to set it aside. Obviously you'll talk to relevant people, de-risk it, and move it aside. But yeah, there's a few ways to handle that.
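
One way to make that three-way triage concrete is a small decision sketch. The Finding fields and the outcome wording below are illustrative assumptions, not anything prescribed in the episode.

```python
from dataclasses import dataclass

@dataclass
class Finding:
    description: str
    aligns_with_objective: bool  # fits this audit's objective and hypotheses
    within_mandate: bool         # fits the broader audit mandate
    high_risk: bool              # critical enough to warrant immediate attention

def triage(finding: Finding) -> str:
    # First way: pursue anything aligned with the current objective.
    if finding.aligns_with_objective:
        return "pursue within this audit"
    # Second way: mandate-relevant but off-objective, so act now or defer.
    if finding.within_mandate:
        return ("look at it straight away" if finding.high_risk
                else "set aside for a future audit")
    # Third way: outside the mandate entirely.
    return "discuss with relevant people, de-risk, and set aside"

# Example: an interesting anomaly that falls outside the current objective
print(triage(Finding("duplicate supplier payments", False, True, False)))
# -> set aside for a future audit
```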

Conor:

Excellent. I think you've covered all those scenarios there really well. Okay, so that was number three. So making sure that we're staying true to the audit mandate and the objective and hypotheses that sit within that. Number four.

Yusuf:

Often as auditors we want to take an exploratory approach to what we're doing. We may have a broad objective and a broad hypothesis in mind, and we want to know whether we can achieve that. Then we look at some data and find some things that are not related, but that may be important. The question we need to ask ourselves is: what's the cost-benefit of conducting that analysis without structure? Analysis that isn't defined by an objective is exploratory: we look at some data, we find something, and we go deeper and deeper into what that particular thing may be. It may be a fraud situation, or it may be some missing data; there are various things we can put into this bucket. So the question is, what's the cost? How much time is this going to take me to evaluate, both technically and in terms of understanding the particular matter at hand? And what's the benefit? Are we losing money as an organization? Is there potential for fraud? Now, with fraud you usually don't do much cost-benefit analysis; you usually just want to uncover it. But when it comes to things like loss, or slight deficiencies in the way we collect money, how long is it going to take to evaluate, and should you be doing it as part of your audit work? Sometimes, if you find something like that, and I'm talking non-fraud now, the better thing to do is not to try to evaluate it yourself and go all the way down to understand exactly how it works, but to pass it back to the business or the agency you're working with and say: hey, I found this, can you please explain whether this is important, how important it is, and what the answer is? They may already have evaluated it, particularly if we're looking at data cold, where we haven't actually gone to understand the way the business works.

Conor:

I think sometimes it's useful to have that benefit conversation with the business itself, like you said. Because you might not be able to put a figure on how many hours or what the cost will be to actually come up with something meaningful, maybe focus on the benefit side of it: go back to the business and ask, what's the benefit to you if I continue down this path and this analysis? Rather than trying to quantify the inputs you'd need on your side.

Yusuf:

Think about something like stock losses and shrinkage within stores. A really simple example, but most large and medium sized organizations have some sort of threshold in place, an allowable error limit. They may accept 0.25% of shrinkage as an acceptable limit. If you're seeing 0.1% of shrinkage and you think it's important, first go and ask somebody: how important is this? How much shrinkage do we allow? Because you may spend hours and hours working out that you've lost $10 for every $10,000, and then somebody tells you, we're willing to give up $25 for every $10,000, because it just costs too much to try to track it down. And then you've wasted all your time. That's the sort of thing we're talking about.
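
The underlying check is just a rate comparison against the organization's threshold. A minimal sketch, using the percentages from the example; the dollar amounts are hypothetical.

```python
# Compare observed shrinkage against the accepted threshold before
# deciding whether it is worth investigating further.
ACCEPTED_SHRINKAGE = 0.0025  # 0.25% allowable error limit
observed_loss = 10.0         # dollars lost (hypothetical)
stock_value = 10_000.0       # dollars of stock handled (hypothetical)

observed_rate = observed_loss / stock_value  # 0.001, i.e. 0.1%

if observed_rate <= ACCEPTED_SHRINKAGE:
    print(f"Shrinkage of {observed_rate:.2%} is within the "
          f"{ACCEPTED_SHRINKAGE:.2%} limit - probably not worth hours of work")
else:
    print(f"Shrinkage of {observed_rate:.2%} exceeds the "
          f"{ACCEPTED_SHRINKAGE:.2%} limit - worth raising with the business")
```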

Conor:

Fantastic. A great example, and one most of us encounter at some point in our working lives. Okay, what's the last thing we need to be thinking about?

Yusuf:

This is a tricky one, because sometimes it's pre-analysis and sometimes it's pre-data-access, but let's focus on the pre-analysis piece. Sometimes we can't find the data we need to conduct a particular test, or we find that the data we're looking to use is deficient. If we haven't taken an objective- and hypothesis-based approach, we don't know whether we should be going to look for that data, or suggesting the data should exist, or suggesting the data should be cleansed, in the case of data that's missing or isn't very clean. You can't answer the question, how important is it to get this data, if you haven't determined upfront what the objective is and how you're going to use data to achieve it. Sometimes we'll find that a field is missing, for example, and that field isn't completed because an application team or a business team has determined that it just costs too much to capture that data, and capturing it doesn't actually improve the way objectives are achieved or success is measured. So they don't capture it. You don't know that if you're just looking at data cold. The other part is that you don't know whether you should be asking for certain data, or certain datasets, to be captured if they aren't directly linked to an objective or a hypothesis you're testing. So you don't know if the data should exist, you don't know if it's going to be useful, you can't have the conversation around it, and you can't determine whether to go looking for it, if you haven't started at the correct place. The idea for this particular item is based on experience, but it's reared its head again recently: there have been a lot of situations over the last six months where we just haven't been able to get to data, and in some of those situations we didn't need the data, or the data wasn't actually going to help achieve an objective. You need to come to that determination quickly; otherwise you could spend hours and hours looking for something without being sure it should even exist. So, similar to what we said about how important a result is: how important is this data, or the existence of this data, or the cleanliness of this data? You can only determine that if you know you're linking your work to a strategic objective. That link, again, is really important.

Conor:

And sometimes it's useful to put yourself in the shoes of the senior executive business owner, or even the CEO. You cannot hope to have a compelling rationale for why data needs to be captured, or better cleansed, if, like you said, you're not able to explain the objective you had in mind before you even started down your data analysis path. You're really doing yourself a disservice there as an auditor.

Yusuf:

That's right. What is the cost-benefit to the organization of capturing, holding, and cleansing the data that you found, or didn't find?

Conor:

That rounds us off. There are five really important things to think about that contribute to our focus on the objective and hypotheses before we commence our data analysis for audits. The first: what are the signs of success? It's really about having that crystallized objective from the outset. The second: the dangers of relying on rules-based tests and things that have gone before, and why it's important to focus not on what's available to us in terms of rules, but on what we need to do, and what analysis actually goes to the heart of our objective, so there's a clear linkage. Thirdly, the importance of staying true to our audit mandate, staying true to the objective we just described, and staying true to the hypotheses. That being said, there's always room to revisit those in an iterative process, and as we see more and more agility in auditing, that becomes even more important; but we still need a clear focus on those three things. Fourthly, we talked about the importance of putting your cost-benefit hat on: what's the cost of exploring the data and trying to figure out what's in there, versus the benefit to the organization of that process, and making sure we have those considerations front of mind. And lastly, when we go looking for data: is it available? Should it be available? How clean is it? Do we need to clean it? Again, this is with our cost-benefit lens on, stepping through those key considerations.

Yusuf:

You've got an overall strategic objective; your audit mandate, audit plan, and audit focus are then linked to that, to make sure you're helping to protect and create value. The objective and hypotheses you create for using data as part of an audit link back to that. When you break that link, everything else falls down. So don't break that link.

Conor:

I think that's sound advice and a good summary. Great conversation. Good speaking with you Yusuf and we'll catch up soon.

Yusuf:

Thanks Conor.

Conor:

Cheers.

Narrator:

If you enjoyed this podcast, please share it with a friend and rate us in your podcast app. For immediate notification of new episodes, you can subscribe at assuranceshow.com. The link is in the show notes.
