
Auditing with data: for Performance Auditors and Internal Auditors that use (or want to use) data
The podcast for performance auditors and internal auditors that use (or want to use) data. Produced by Risk Insights.
35. Stephen McAteer, Performance Audit Data Scientist at VAGO
Stephen leads the data science team within Performance Audit at VAGO, the Victorian Auditor General's Office. We discuss:
- What VAGO does
- Stephen's approach to helping auditors use data
- Why it's important for all performance auditors to have data skills
- How VAGO has used open data to improve transparency
You can reach out to Stephen via LinkedIn (www.linkedin.com/in/smcateer).
Links
- VAGO Audit Report – Measuring and reporting on service delivery with interactive dashboard for exploring Victorian department performance (www.audit.vic.gov.au/report/measuring-and-reporting-service-delivery)
- VAGO Audit Report – Accessibility of Tram Services with interactive map on the accessibility of Melbourne’s tram network (www.audit.vic.gov.au/report/accessibility-tram-services)
- VAGO Audit Report – Council Libraries which uses Data Envelopment Analysis (DEA) to evaluate efficiency (www.audit.vic.gov.au/report/council-libraries)
About this podcast
The podcast for performance auditors and internal auditors that use (or want to use) data.
Hosted by Conor McGarrity and Yusuf Moolla.
Produced by Risk Insights (riskinsights.com.au).
You're listening to the Assurance Show. The podcast for performance auditors and internal auditors that focuses on data and risk. Your hosts are Conor McGarrity and Yusuf Moolla.
Conor: Today we've got Stephen McAteer, the lead data scientist from the Victorian Auditor-General's Office. Welcome, Stephen.
Stephen: Yeah, thanks for having me on here, guys.
Conor: Can you kick us off with a brief history of your work to date?
Stephen: Yeah. Yeah, for sure. I came into the discipline through physics; I studied physics at university. From there I landed in operations research at the Department of Defence here in Australia, then went through a couple of commercial roles at Telstra and Australia Post, and ended up in this role. The main thing that attracted me to it is that you're directly working for the good of society, the good of Victorians. As for what we do at VAGO, for people who don't know, government audit like this has two big parts: financial audit and performance audit. VAGO is a little bit unique in that the data science function, the data analytics function, is broken up into two teams, one on the financial audit side and one on the performance audit side. The team I head up is on the performance audit side. What that's concerned with is, straight out of the Act that we operate under, to assess the economy, efficiency and effectiveness of government departments. So a pretty broad mandate, pretty hard to pin down at times, and that makes the work pretty exciting.
Conor: So how does a performance auditor work with a data scientist?
Stephen: I've been in the role for about a year and a half now. I'm pretty new to performance audit; I hadn't had a whole lot to do with it up until that point in time. And I think it's pretty safe to say that it's not a well-established art. But the way that I come at it is that what you're trying to do is a risk assessment. Performance audit is an external version of that third line of defence. So you've got a risk assessment that happens in the business unit itself, internal to an organization. You have a central risk function which makes sure that the risk culture is operating well within the organization. And then you'll have internal and external audit. So that's where we come in, and our job is to make sure that risk function is operating properly. So we use that in our day-to-day approach. The planning phase of an audit is actually called the risk assessment phase, that's what we call it internally. The conduct phase we call internally the risk response phase. And then reporting is reporting. That's what guides us. In that early stage we're really looking at, where are the risks? And the work that we actually do in the audit is guided by where we assess those risks to be. So in terms of data science, at the risk of demystifying the whole field and putting myself out of work, it's essentially just another type of evidence. In performance audits you use interview evidence, you have documents, various types of evidence. Sometimes that evidence comes in the form of tables or gigs of data. When it gets into that realm, it becomes hard for the rank-and-file performance auditor to deal with. So that's where we come in. And it's really just that. It's how can we understand the risk that exists at the agency that we're dealing with, using all the evidence that's available. If you don't have people who can interrogate the data, analyze the data, there's evidence that's going to be missing, and you're not going to get the full picture.
The performance auditing standards are really explicit about the way that you deal with evidence, the way that you use evidence to contribute to your findings and all that kind of stuff. And we're guided by that. It's just that, technically, there are different techniques involved.
Conor: Obviously data is an important domain for evidentiary material, as you've just described. Does your team ever get involved earlier on, as part of the scoping of particular audits, to try and understand what's possible perhaps with data?
Stephen: Yeah, yeah, for sure. So we have a planning cycle that goes through every year, with a rolling three-year plan looking forward into the future. And again, it works on a risk basis as well. We look at all the different possible audits that we could do, and we try to remove as much uncertainty as we can from our understanding of the risk that exists at the agencies. So to the extent that we can use data to inform that, we do. It's interesting, because it's one of these areas where there's a lot of judgment needed when you're doing annual planning, when you're planning what things you need to look at. There are a lot of qualitative sorts of assessments. You need a team on the performance audit side itself, the organic team, that has a really good understanding of the domain, of how the agency's operating and the environment it's operating in. And so a lot of that assessment is inevitably going to be using soft skills and qualitative assessments. But that said, there are areas where we can start going in and informing that. One of the interesting things is that, under a recent revision to the Act, we've now got a power to do what are called limited assurance reviews. So that allows us, with a lot less overhead, to reach out to an agency and ask for some set of data. It's not limited to data analysis, but we can use it in this way. Have a very quick look at a piece of evidence, come to some sort of an understanding of the state it's in, and use that to guide the annual planning. Of course, there's also publicly available data. One of the pieces of work that we've done recently is a dashboard that we've put up, if you drill through the VAGO website you'll be able to find it, displaying the measures that the departments use to report their performance. So each agency or each department gets a bucket of money, but that bucket of money is tied to particular outputs, and those outputs have very specific measures against them.
It's great that we have a performance reporting framework in government. But all that has been buried in spreadsheets on the Treasury and Finance website, really hard to interrogate and really hard to look across years and all that kind of stuff. The information was all there, but the piece of work that we did basically put it in a way which is really easy to understand and navigate. So it's increasing the transparency of the information, and that's something which is certainly going to feed into our annual planning. So at a glance, the audit teams, the sector teams, who are involved in that department will be able to browse through all those different measures. Look at the ones that are sliding, look at the ones that are performing well. And use that as a way of guiding where we might want to put attention. Of course, with COVID, all the data from this year looks a little bit strange. We were doing the quality assurance on that data, making sure that we were moving things across from those spreadsheets into our data set properly, and there were a lot of very unusual numbers this year that were true, that were actually the right numbers. I think that's one of the things that's going to be really interesting in the coming couple of years. A lot of these normal patterns that you expect to see are going to be very strange for a while.
Conor: You've hit upon a couple of important concepts there, one of them being transparency: bringing all that disparate data together and being able to tell a story in a way that's digestible. That's obviously good for the normal punter on the street to see how services are being delivered, but something you just mentioned there was the added benefit of the agencies themselves having more of a consolidated view of their performance. What sort of feedback have you had from those agencies about VAGO taking on that role, or perhaps helping them out, to some extent, to run their own business?
Stephen: It's interesting, because when you start talking about things like this... We operate under the Act. There's an Act that stands us up as a statutory body, and really that Act defines what it is that we do. There's obviously a whole bunch of interpretation around that. But transparency isn't necessarily our central role. So it's an interesting conversation to have, and it might be different in different audit offices, in different jurisdictions, working under different Acts. But exactly where the boundary is of what it is that you do is a really interesting conversation. Of course, transparency allows the public to look into the agencies and see how they're performing. It's almost a secondary sort of function. So our primary function is to look into, it's in the Act, the economy, efficiency and effectiveness of the agencies. And of course, when we report on that, we're putting it into the public view. So transparency is in there, it's in what we do. I think one of the things that's interesting, particularly about that performance measures dashboard that we did, is that one of the primary audiences, of course, like you mentioned, is the public. But the departments themselves showed quite a bit of interest in taking carriage of that. So once we've shown the way that it can be done, it's really in Victorian Treasury and Finance's area to provide that kind of transparency into those measures. They're the ones who publish the spreadsheet versions of this data. So in some ways we're showing how, and the nice thing would be for those agencies to take carriage of it. There's an example of that: there's an ICT dashboard, which provides basically an overview of the ICT projects that are operating in Victoria, whether they're red, amber, green and all that kind of stuff. It was originally based off a VAGO report and dashboard that we produced, and then they took carriage of it.
Our Auditor-General, Andrew Greaves, is very keen on us providing value back to the agencies in that way, showing them how. We're always very keen to hand over any IP that we generate to departments if they want to take it and use it. Any code that we use and all that kind of stuff, we're very keen on handing over. One of the other big audiences of this is the parliamentary committee that we report to, which is PAEC. This dashboard will allow them to look through those measures and understand exactly what's going on in the departments in a way that they wouldn't be able to do just by looking at those spreadsheets. So you provide the product to multiple audiences, potentially. And I think that's one of the things that's really important to get when you're producing a piece of communication like that: really understanding who is reading it and what it is that you're expecting them to get out of it.
Yusuf: That's an important point. And obviously we've seen improvements in that over the years from audit offices, where we'd have dashboards going up just for the sake of putting dashboards up, without always necessarily knowing who the audience is. But how do you get that balance between three sets of stakeholders, really? You've got the public broadly, you've got the individual agencies that you're working with, or even others that may benefit, and then you've got the parliamentary committee. How do you balance the needs when you are producing something like that?
Stephen: It's like any piece of communication. I guess one of the things I rely a lot on is that your typical performance auditor has maybe an arts or a law type background. Typically they're very good at communication, very good at understanding how to produce these written reports. I think my big thing is trying to leverage that and get the communication of our stuff done in the same way that any other piece of communication is produced, in a very tight and well-appreciated way, in performance audit in general. So again, like we were saying before about it being just another type of evidence: the dashboards, the graphs, the charts, the numbers that you put in the sentences of the report. It's just another piece of communication, and everything's subservient to the needs of the audience. In the same way, in a performance audit report that you produce, there'll be sections which talk to particular audiences. There's a summary section that sits at the front; you're expecting almost everyone to be able to skim through that and get a decent appreciation for what it was all about, what we found and what they need to know. And then as you drill down, you get more technical parts that maybe not everybody would be interested in. And then you've got annexes sitting at the end of that; very few people perhaps would ever get that deep into the report. So I think it's just about that: you understand who all those different audiences are. And the reality is we're talking to the public, we're talking to departments, we're talking to PAEC, we're talking to rank-and-file working staff in departments. We're talking to senior levels in departments as well, who need very different things out of this. You can communicate in multiple ways. With that dashboard that we produced, I think the way that we produced it works reasonably well for public and PAEC consumption, and for the departments as well.
But if we struggled to produce something which was going to be suitable for all those audiences, then the conversation would be: well, do we produce multiple things? Does it become some sort of chart, or some other method of communication that lands in the report rather than being a dashboard? Maybe we're giving a presentation to the departments, or to people within the departments, and you communicate that way. So you just have to really tailor the communication to the message and make sure that you're getting it across using the right methodology. Again, these are skills which we've got in abundance in performance audit, because that's the kind of people that you tend to be dealing with. Those communication skills can sometimes be a bit lacking on the more techie side of things, on my side of the fence. I think it's just about leveraging off each other's skills and pushing for that shared goal as much as you can.
Conor: You've mentioned a few times now, Stephen, that data is just another source of evidence, which is very important, but you also said that data analysis can't be done in isolation, because there's a lot of contextual information about the audited entities that needs to be brought on board. You need to make a lot of judgments about what's important and what the risks are for those particular agencies or programs. Our discussions with some other audit offices seem to suggest that combination, or coming together, of the strict data people and the audit teams themselves is not always an easy path to tread. What's worked well for you in trying to bring those skill sets together?
Stephen: I definitely appreciate it's a bit of a challenge. I think it's a challenge in performance audit in general. So one of the big stages in a performance audit is where you get up to a sufficient level of expertise to be able to make sensible judgments about the topic. It's a really hard section of any performance audit, and a really exciting part as well, because you get to learn all this new stuff and you get to understand this whole new area. I think performance auditors love this, but it's also very hard work, something that you have to approach really diligently. And the challenge for us as a data science team is that, unlike the performance auditor, we're not just on that one audit. We'll typically be working on three or four of them at a given time. We have to get up to that level of expertise to some degree, but we're also relying on the team providing us with that context to some extent, because we just can't be as deep into it as they are under the current model. So there are a couple of things that we're doing about that. The first is that, in our process, one of the really early stages is that we get a member of the data team attached to the audit team. Now, we're not going to be there for every single meeting and doing every piece of work that the audit team is doing. But at least we're there in those very early days when the lines of inquiry and the criteria that the audit is going to be built on are still being formed. So being part of those early conversations and really understanding what's of interest to the audit, you can do two things. One, you really understand where those lines of inquiry are coming from, how they're important and how they contribute to the overall objective of the audit. But you can also shape them a little bit.
So by getting in there early and doing these pinch-hitting quick pieces of data analytics, you might pull down a quick spreadsheet, run a quick calculation and say, hey, look, this number looks really big, this number looks really small, whatever. You can actually start to shape those lines of inquiry and criteria, which is that risk assessment phase. And that's really where you're bedding down exactly what the testing is that you're going to do. So that's one thing: just get in early, get in as quick as you can. And I'm talking to our sector heads, the performance audit team is broken into sectors: education, environment and so on. So I'm having conversations with them every couple of weeks, scanning that horizon. What are the audits that are coming up? Do you think there's going to be much data work in that? When they say no, I say, maybe actually there is. It's just really having those early conversations so that we know what's coming up as it's coming up. The worst thing that can happen is that somehow one of those audits gets away from you, and the next thing you hear about it is that planning is finished. And that's a real nightmare sort of scenario, because you find yourself having to backtrack and say, no, actually, hang on, there is a whole bunch of stuff that you can do, and there are a lot of missed opportunities. The other thing that we're trying to do to address that, which the Assistant Auditor-General, the head of performance audit, Renee Cassidy, is really keen on, is that at least some degree of basic data skills is just part of what all the performance auditors will have. They might not have all the skills and be able to do all the analysis that the data science team can do. But they'll have enough of an understanding to see what the opportunities are, understand what can be done, and know when to call us.
And that question of understanding the context, of understanding exactly what's going on with the audit, you fix that by making the analytics more organic to the team itself. The vision is that we would then be reserved for the really hairy stuff: the data that's really hard to wrangle, and the stats that may be a little bit more esoteric, a bit more weird and wonderful. But the run-of-the-mill stuff, the teams really should be able to do. Again, that mantra of it's just another type of evidence. So if your job is to perform analysis of an agency, and some of the evidence you're going to come across comes in the form of data, well, that now becomes just part of your job.
Narrator: The Assurance Show is produced by Risk Insights. We work with performance auditors and internal auditors - delivering audits, helping audit teams use data, and coaching auditors to improve their data skills. You can find out more about our work at datainaudit.com. Now, back to the conversation.
Yusuf: That last point that you mentioned there, that's enabling scalability across the team. So you've got performance auditors that you'll want to have an understanding of what they could do, and some basic understanding of the way in which data works. But in terms of your team, how do you enable scalability? The nature of your work is quite unique, right? You've got particular projects, if you like, that are all going to be different throughout the year. What you're involved in, what you do, where you source your data from and what the eventual outcome will be is all going to be so different from case to case. How do you enable scalability of your team, given that?
Stephen: It's tricky. I think one of the things is, aside from anything else, as you look forward through the audit plan, there'll be times where you might get a couple of very data-heavy audits sitting right next to each other, and then other times where the ones that you're working on at the moment just happen to not be data heavy. So part of it becomes better planning: getting into that really early stage of the annual planning and saying, hey, look, you've scheduled those five really heavy ones right next to each other; maybe if we swap these two, that would allow us to work on them a bit easier. You can always bring in hired guns as consultants to do pieces of work, and we do that on occasion. But the disadvantage is that every time you do that, you miss out on an opportunity to increase your own capability. That said, there are points where subject matter expertise is unavoidable, and we're not going to be able to get to the level of expertise we need. Examples are bringing economists in where the analysis requires real economics expertise, or it might be people with a medical data background, where those are very specific skills; it's not realistic that we're going to have a person with all those different kinds of expertise on the team. So hiring people in from the outside is one way of doing that. The fact that the work itself is going to be very esoteric, very ad hoc, bespoke, every piece of analysis a bit different, that's on one hand a real difficulty, because it basically means you need a really broad skill set, and you need to be able to tackle all these different types of analysis and use all these different statistical methods and techniques. The flip side is that every time you come across one of those things, it's an opportunity to learn something new. So I love that.
And I think a big part of that, and one of the things that we're trying to hire for when we get people into the team, is just that real curiosity, that willingness and eagerness to learn new things and to always be trying to figure out these sorts of puzzles. I'm regularly running into issues where the statistical tests that I know aren't applicable to the scenario, so I have to go and figure out some new test or some new method to apply.
Yusuf: Like your first audit, where you had to go and do some DEA.
Stephen: Yeah, that's right, that was an interesting one. I mean, that raises a couple of points. So I arrived while this audit was moving into the reporting phase, so it was quite late on, and there was a piece of analysis that the Auditor-General was pretty keen on getting into that report. The idea was, basically, we're looking at libraries and we're looking at the efficiency of libraries. Efficiency in government is a really hard thing to assess sometimes, because you're not operating in a market. Often efficiency comes from having different factories producing similar widgets, or different companies producing similar widgets, all operating in this economy where you get efficiencies because the ones that can't get a profitable product into the market at that price are going to fall away, and the ones who are left are going to be more efficient. In government, you don't necessarily have that. You've often got one service provider, and what they're providing isn't something that you sell; you're just producing something. You might be producing audit reports, or you might be servicing patients in a hospital or something like that. It's not always something that's easy to assess using normal methods. So we used this data envelopment analysis technique, and it's quite a clever method actually. I encourage people to read into it. I won't ruin the podcast by going into the details here. That'd be the next one, yeah, I'll give a lecture on that. But it's a really interesting technique, because what it allows you to do is have inputs and outputs that aren't even in the same units. All you need is that the cost direction is all the same, so higher cost is bad, and the output direction is all the same, so those numbers being higher on the output side is good. With a couple of other mild assumptions, you can set up a scenario where you can compare entities to each other.
So given these inputs, these entities are more efficient in some sense, and you have to put some caveats around that, which we had to work on in the report. But yeah, that was a technique that I'd never heard of. The Auditor-General had heard of it and was keen on using it. Week one at VAGO for me was figuring out what this technique was and how it works. Luckily I mostly use Python and there's a library for that, so that was handy. Once I had an understanding of how the technique worked, it was relatively straightforward to apply: set up the data in the right way, shove it into this Python library, and produce the numbers. Which is just amazing, because years ago, of course, you would have had to code all that up from scratch. But yeah, we produced that and got it into the report. I think there ended up being a whole section of the report written about that technique, which was good. And it allowed all the libraries to compare their efficiency to each other, in a way that you couldn't do without using a really sophisticated technique like that. So yeah, that was good. And I guess that's typical. That's one of the things I love about going to work every day in this sort of environment: you're thrown these challenges all the time, and rising to the challenge involves learning things and expanding your own knowledge, which is just good fun, you know.
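The DEA idea Stephen describes can be sketched in a few lines of Python. This is a minimal, illustrative implementation of the standard input-oriented CCR model solved with `scipy.optimize.linprog`, not the library or code VAGO actually used; the function name and toy data are invented for the example. For each entity, it asks: by what factor could a weighted combination of peers shrink this entity's inputs while still matching its outputs?

```python
import numpy as np
from scipy.optimize import linprog

def dea_efficiency(X, Y):
    """Input-oriented CCR DEA efficiency scores (illustrative sketch).

    X: (n_units, n_inputs) array of inputs (lower is better, e.g. cost).
    Y: (n_units, n_outputs) array of outputs (higher is better, e.g. loans).
    Units need not match across columns -- only the directions matter.
    Returns efficiency scores in (0, 1]; 1.0 means the unit sits on the
    efficient frontier.
    """
    n, m = X.shape
    _, s = Y.shape
    scores = np.zeros(n)
    for o in range(n):
        # Decision variables: [theta, lambda_1, ..., lambda_n]
        c = np.zeros(n + 1)
        c[0] = 1.0  # minimise theta, the input-shrinkage factor
        # Input rows: sum_j lambda_j * x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o].reshape(-1, 1), X.T])
        b_in = np.zeros(m)
        # Output rows: -sum_j lambda_j * y_rj <= -y_ro
        A_out = np.hstack([np.zeros((s, 1)), -Y.T])
        b_out = -Y[o]
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.concatenate([b_in, b_out]),
                      bounds=[(0, None)] * (n + 1),
                      method="highs")
        scores[o] = res.x[0]
    return scores

# Toy example: two libraries, same output, one using half the input.
# The cheaper one is efficient (1.0); the other scores 0.5.
cost = np.array([[1.0], [2.0]])     # input: operating cost
loans = np.array([[1.0], [1.0]])    # output: loans issued
print(dea_efficiency(cost, loans))
```

Each efficiency score is the optimum of its own small linear program, which is why an off-the-shelf LP solver is all the machinery required; real DEA libraries add variants (output orientation, variable returns to scale) on top of this same core.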
Conor: So you mentioned there the use of a sophisticated technique straight away, and as you said, you were thrown into the deep end on that. I'm wondering, though, from the auditors' side, are there any common challenges they come to your team with? Are you seeing anything where there's a repeat need to provide assistance, whether it be through certain techniques or other things?
Stephen: Yeah, definitely. And I think that these are the things that are the perfect candidates for being put into the skills that you demand of the performance auditor. The one that I have people reaching out to me the most for is power analysis for statistical tests. It's typically a yes/no question. You're looking at some sort of binomial distribution, and you're trying to figure out how big your experiment needs to be in order to give a sufficient level of certainty. And the question usually comes over the fence in the form of "what does n need to be?" And then the question comes back from me: well, what is it you're trying to say? What sentence is it that you want to put in the report? The sentence is going to be of this form. You know, typically you're not saying "at the 95% confidence interval" and all that kind of jazz, because you want it to be readable. But at some point you're going to be able to make some statement: we are confident that at least 10% of X is bad, or we're confident that at least 90% of X is good, or whatever it is. Then you start working back: how much effort does it take you to get one of these responses? Is it sending an email to someone? Is it that we can send the survey out and get thousands back without much effort? Or is it that you're going to have to go retrieve documents from an archive and do a manual analysis? So the question obviously then becomes: how much work are you willing to do in order to get that extra percentage point of certainty? That's typically the one that comes back. That represents a shift in the way performance auditing is done, and the fact that it comes across the fence so often is really the argument that it should be one of the core skills of a performance auditor. And it wouldn't be everything; it wouldn't be using the real corner techniques.
So, how do you assess that confidence interval when you're getting up to around 99% as your response rate, and things like that? Things start to get really hairy when you get into those really odd corner cases, and that's where we would jump in. But where it's a really run-of-the-mill question of how many things you need to ask, how many samples you need to take in order to get a given level of confidence, that is a question that every performance auditor is going to be expected to be able to answer in the coming years.
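The run-of-the-mill version of the "what does n need to be?" question boils down to the textbook sample-size formula for a proportion. A minimal sketch, using the normal-approximation formula n = z²·p(1−p)/E² (the helper name is invented, and this is the standard approximation, not VAGO's actual procedure; it also ignores refinements like finite-population correction):

```python
import math
from statistics import NormalDist

def sample_size_for_proportion(expected_p, margin, confidence=0.95):
    """Rough sample size for estimating a yes/no proportion.

    expected_p: anticipated proportion (use 0.5 if unknown -- worst case).
    margin:     acceptable half-width of the confidence interval, e.g. 0.05.
    confidence: two-sided confidence level, e.g. 0.95.
    Normal approximation: n = z^2 * p * (1 - p) / margin^2, rounded up.
    """
    z = NormalDist().inv_cdf(1 - (1 - confidence) / 2)
    n = z ** 2 * expected_p * (1 - expected_p) / margin ** 2
    return math.ceil(n)

# Worst-case proportion, within +/- 5 points at 95% confidence:
print(sample_size_for_proportion(0.5, 0.05))  # → 385
```

This makes the trade-off Stephen describes concrete: halving the margin roughly quadruples the required sample, which is exactly the "how much work for that extra percentage point of certainty" conversation.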
Yusuf: Have you had the opportunity to engage with other audit offices, either locally or globally, to share ideas or insights or challenges, that sort of thing?
Stephen: One of the big organizations that we have here is ACAG, the Australasian Council of Auditors-General. That's one of the real ways that we share information around. As I was saying before, we're a relatively small team working in this area. A lot of the work that we're trying to do is really hard, and it's hard to scale up, really. If you had a bigger team, if you had dozens of people on the team, there are certain things that you could do, certain things you could figure out, that become really onerous when you're in a small team. There's a community out there, and there are a lot of us who speak to each other. ANAO have some people that I get in touch with regularly; we were planning on presenting together at Impact, a conference that VAGO was running, before COVID came in. That's one of the key things: just that real collaboration. It seems like trivial things, but one example is a role description. We hired a grad, and that involved writing a role description. It doesn't seem like much of a task, but multiply it by every audit office in the country, and all of a sudden there's quite a bit of effort being burnt on that. But we can share those around: hey, we hired a grad, this is the description we used. That's something another audit office can potentially pinch and adapt for their own use. When we're talking about this competency for performance auditors, around what it is that we expect them to do, those are the kinds of things that we can really collaborate on. The nature of the work means that you can't necessarily collaborate on all the nitty gritty of the actual audits that you're working on, but on a lot of that administrative-side stuff, there's definitely a lot of potential for leverage. And once the work is done, then of course sharing how you did it, the techniques and all that kind of stuff, is something that's really important.
There's an internal knowledge share session that they do over at the audit office in Adelaide, which I'm going to be presenting at in a couple of months' time, on some of the recent work that we've done. Anything we can share with them can help them, and equally anything any of the audit offices come across, if they can share it with us, it just means we don't all have to fall into the same traps. Internationally, there hasn't been a whole lot of direct collaboration, but going through the stuff that does get made public is a really great resource. The Government Accountability Office in the States and the audit office in California do some really great work. Even just seeing the kind of work they're doing forces you to raise the bar a little bit.
Conor:What does the future hold for data in performance audit either broadly or in terms of your ambitions for VAGO?
Stephen:Our ambition is that we're really squeezing every last piece of juice out of the data that's available. We throw the word exploiting around quite a bit; we want to exploit the data as much as we possibly can. It's just disappointing if you get to the end of an audit and there was any piece of evidence that you didn't leverage as much as you possibly could. There's obviously the potential for data to be some of that unleveraged potential if you're not careful. How do we get there? I think it's by everyone's skills lifting up. The kids coming in now, the graduates and the young folk coming into the teams, are coming in with really amazing skill sets. They're doing coding in primary school now. When that generation comes through, it's just going to be natural for them to take on a lot of this stuff; what we're seeing now as really specialist skills are going to be seen as basics in the future. We see being able to use Word and Excel in some basic way as a fairly standard skill set. I think coding and the ability to interrogate data are going to be just something they can do. They're going to be doing projects on it in year 12 and through uni and all that kind of stuff. I think that's the future: the general skills getting better. I'm big on limiting my own career. There's obviously always going to be a niche for really technical stuff. But if I told you now that I was on the spreadsheet team, the team that knows how to use Excel, there's just not that much call for it because everyone can do it. And I think that's really where the future is. Everyone can do a lot of this stuff. What do you mean you're doing data analysis? Everyone's doing data analysis.
Conor:You mentioned the use of open data and how that's been used. We think it hasn't been properly tapped into yet. We'd be interested in getting your views on that.
Stephen:When you're on a performance audit there's a tendency to focus on the agency itself, to some extent. They're the ones you have the legislated power to ask for data from, and you forget that there's this whole wealth of stuff just sitting there. We recently did work on accessibility of tram services in Melbourne. The data we used for that, we got it through the agencies, but most of it was actually publicly available. A lot of the information we used was pulled from the Public Transport Victoria API, so it's basically tapping straight into that. The other information we got was basically when tram services are run; again, that's something that's piped out into apps via their API. The other piece of information we used is publicly available through TramTracker, which is what Yarra Trams uses for telling people when the trams are arriving. The big piece of work was that those two pieces of information didn't have a nice common key to join them together, so we had to do a whole bunch of geospatial joining and all that horrible stuff. I'm getting flashbacks thinking about how I did it. But we did that kind of geospatial joining just to get those pieces of information together. So I think that's one of the things with public data: it's there, but there's really non-trivial work involved in getting it into a state where you're actually exploiting it, getting outcomes out of it. The measuring performance dashboard as well, which is a good one to have a look at, again uses publicly available data.
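[Editor's note: the episode doesn't detail how VAGO implemented the geospatial join Stephen describes. As a minimal illustration of the general idea, the sketch below matches records from two hypothetical feeds that share no common key by attaching, to each record in one dataset, the nearest record in the other (by haversine distance), within a tolerance. All data, field names, and the 50-metre threshold are invented for the example.]

```python
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in metres between two lat/lon points."""
    r = 6371000  # mean Earth radius in metres
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def spatial_join(left, right, max_dist_m=50):
    """For each record in `left`, attach the nearest record in `right`
    (by coordinates), provided it lies within `max_dist_m` metres.
    Records with no match inside the tolerance are dropped."""
    joined = []
    for l in left:
        best, best_d = None, float("inf")
        for r in right:
            d = haversine_m(l["lat"], l["lon"], r["lat"], r["lon"])
            if d < best_d:
                best, best_d = r, d
        if best is not None and best_d <= max_dist_m:
            joined.append({**l, **best, "dist_m": round(best_d, 1)})
    return joined

# Hypothetical example: stop locations from one feed, accessibility
# attributes from another, with no shared stop ID to join on.
stops = [{"stop_name": "Collins St", "lat": -37.8152, "lon": 144.9631}]
attrs = [{"low_floor": True, "lat": -37.8153, "lon": 144.9632}]
matched = spatial_join(stops, attrs)
```

In practice a library such as GeoPandas (with `sjoin_nearest`) and a projected coordinate system would handle this at scale; the brute-force loop above is just to make the matching logic explicit.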
Conor:They're really useful examples of how you can bring data together, and anybody who's about to commence a performance audit in another jurisdiction can probably look at those, particularly the transport one, because it impacts so many people; everybody has to travel to work every day and wants to make sure they're getting effective services.
Stephen:When you start seeing some of the increased investment in accessible trams announced since we tabled that report, I'm not taking credit for it, but there has been increased investment in some of those low-floor trams. I think that's one of the great things about working at VAGO: you're doing work which has a real impact, and potentially a really quick impact. In the next few years those trams are going to be operating on the network, allowing, like you said, people to get to work who currently have to work really hard just to get to work.
Yusuf:How important is it for the senior leadership team to be aware of and have an understanding of how data can be used and what the possibilities are?
Stephen:In the position I'm in at the moment, I'm very lucky that we've got a leadership team who are savvy to data and really understand exactly where the value is and how it can be used. I've worked in teams in the past where they wanted to do data stuff because it was a buzzword. But in the leadership we've got on the performance audit side, we've got someone with a background in medical epidemiology, someone with a background in computer science, and someone who's lectured in statistics. So you've got a group of people there who have a really deep understanding of what can realistically be achieved with analytics, and what can't be as well. It puts you in a position where you're not having to try and sell what it is that you're doing. They get it, they understand. I think it's really important to get people into those senior leadership positions who have a deep appreciation for how these things work. They don't have to be practitioners of data analytics necessarily, but there's a minimum level of statistical chops and numeracy that allows them to be smart buyers.
Conor:Fascinating conversation today. Great to hear some of the work that VAGO is doing with its use of data, some of the limitations and challenges you face, as well as the importance of collaboration and finding out what others are doing in the space. One of the key takeaways for me was that a few years down the track, there's going to be an expectation that all performance auditors, and all auditors, have some minimum level of capability with using data in their work. So, brilliant conversation today; it's been a pleasure having you on the show. Thanks again, Stephen.
Stephen:Cheers. Thanks very much.
Narrator:If you enjoyed this podcast, please share with a friend and rate us in your podcast app. For immediate notification of new episodes, you can subscribe at assuranceshow.com. The link is in the show notes.