
The Quality Horizon Podcast
Unlocking OASIS Insights: Understanding Your Audit Reports
In this episode of The Quality Horizon, we discuss OASIS Insights reports with Greg Fontaine, the digital focal for the IAQG, and Jake Lewin, CEO of Intact US.
These reports, generated post-audit, provide organizations with contextual information about their audit results, including benchmarking against similar audits and insights into non-conformances.
The conversation highlights the importance of interpreting these reports correctly, emphasizing that they are not scores but rather insights into audit performance.
For more information, visit the OASIS Insights page.
Susan Matson: [00:15 – 00:31] Greetings everyone, and welcome to the show. I'm Susan Matson and I have two guests with me to talk about OASIS Insights today. Greg Fontaine, who is the digital focal for the IAQG, and Jake Lewin, the CEO for Intact US, the developers of OASIS. Gentlemen, welcome.
Jake Lewin & Greg Fontaine: [00:32 – 00:35] Hi. Nice to see you. Nice to talk to you, Susan.
Susan Matson: [00:35 – 01:23] Thank you. Thank you. Absolutely. So as of April 1st, people began receiving OASIS Insights reports in their OASIS file library. Today, I'd like us to dive just a little deeper into these reports. Now, I know many of our listeners may have either received one or are about to receive one soon. And while many might have an idea of what these reports are, Greg, Jake, I'd like you to shed some light on some of the more technically or technologically driven questions people have been asking. So without further ado, Greg, could you start off the conversation by telling our listeners what OASIS Insights is, when people actually receive these reports, and who receives them?
Greg Fontaine: [01:23 – 02:09] Well, OASIS Insights is the new report that is generated as part of the ongoing OASIS activities and auditing processes. The current version of the report is generated immediately after an audit, and it is designed to give some context to an audit beyond simply, you know, are we still certified? Did we pass? And what NCRs did we get? So it is issued to the organization and every person associated with the organization, be they an administrator, a supplier contact, or a consultant role; any of the people that basically have access to the supplier's audits have access to the OASIS Insights report.
Susan Matson: [02:10 – 02:20] Thank you. Now, there's a number of different sections to these reports. Can you tell us what the primary components are and what each of those sections represent?
Greg Fontaine: [02:22 – 03:53] The main page is kind of the summary, and that gives a definition of the benchmarking cohort that they are being compared to. It gives information about the audit, so you have which audit is being done. It has an audit rating. It will eventually have historical information as these go forward, and then it has the benchmarking and audit results broken down by kind of how the scoring goes. The next section, on the second page, is experimental. It's an AI-generated insight based on what it found in the audit, typically things related to the non-conformances and so on. Then you have a series of graphs and information about the scored and non-scored information, so that you can see the distribution of major and minor NCRs for that particular audit cohort type, et cetera. And then we go to some broader benchmarking and distribution charts, so that people have an idea, for everybody in their cohort, of where the most common NCR distribution, number of on-site days, things like that sit. And then finally, we are including available resources based on what occurred in the audit. At the moment, that includes the AIM content for NCRs that they incurred in the audit, and it will be including SCMH guidance material in the future.
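Taken together, Greg's walkthrough suggests a report shape along the lines of this minimal sketch. The class and field names are illustrative assumptions only; the actual OASIS Insights schema is not published in this conversation.

```python
from dataclasses import dataclass, field

# A minimal sketch of the report sections Greg describes; all names here are
# hypothetical, not the real OASIS Insights schema.

@dataclass
class InsightsReport:
    # Page 1: summary
    cohort_definition: str         # which similar audits this audit is compared to
    audit_info: str                # which audit is being done
    audit_rating: float            # a rating of the audit, not the supplier
    benchmark_by_score_area: dict  # benchmarking and audit results by scoring area

    # Page 2: experimental AI-generated insight
    ai_insight: str                # narrative built from non-conformances and findings

    # Graphs: scored and non-scored distributions for the cohort
    ncr_distribution: dict         # e.g. {"major": [...], "minor": [...]}
    cohort_charts: dict            # common NCR areas, on-site days, and so on

    # Resources keyed to what occurred in the audit
    resources: list = field(default_factory=list)  # AIM content now, SCMH later
```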
Susan Matson: [03:53 – 04:07] Sounds like there's a number of different tools and data and metrics and information being provided. Jake, how is one to interpret the different points of information being provided in these insights?
Jake Lewin: [04:07 – 05:16] Yeah, you know, that's a really interesting question, because there is a lot of data and a lot of information, and I'm sure we'll get to it; there's even more to come. You know, the way we think about this is, first of all, achieving any 9100-series certification is a huge achievement. And this is additional context on top of that achievement. And the fundamental basis is that it's a moment in time, in the context of this audit compared to similar audits. So it's contextual. It's contextual to that moment and to those other audits. And it's not really intended to be a rating of the supplier as a whole, but more of an insight into how that audit went and what the outcomes were. I guess it's important to not over-dramatize the results, but also use them to help drive performance and achievement and understanding.
Susan Matson: [05:16 – 05:38] That's a really interesting point, and I do want to make sure our listeners are clear, especially those new to this, because it is relatively new. There are some misunderstandings. I know some people talked about that rating actually being a score, and that's not the case. So can you, Jake, tell us a little more? What are some other misunderstandings that we need to clear the air about?
Jake Lewin: [05:39 – 07:13] Well, you know, yeah, this is interesting, right? Anytime you have a score, well, we're conditioned, right? We live our whole lives around scores and grades and whatnot, so scores in a way are scary, and probably the biggest pitfall or danger here is to make a rash decision on a low score. But I think it's important for everybody in the process to understand that any kind of score is in the context not only of other audits, but also of what is being achieved in that audit. An organization that is large or complex, or is even requesting a very rigorous audit, might have any number of non-conformances. That in and of itself is not bad, particularly when you take into context that this is all coming forward at the conclusion of an audit being published, at which point these organizations have actually worked to resolve non-conformances. So really, what we're looking for and what we think is important is that the reports are taken in the context of their moment in time, and that organizations work together, and also within their organization, to not over-dramatize the results, but rather use the report to improve their QMS and ultimately the goals of the QMS, which are on-time delivery and quality.
Greg Fontaine: [07:14 – 07:48] Again, along those lines, I think it's important to remember, when you're looking at, say, a one-day surveillance audit that's part of a multi-site certification or something like that, that not all audits are created equal. Certain things are covered during one part of an audit cycle and not during the other. The resulting level of complexity means that while we're looking at this, it really is a point in time. It's not a supplier rating, it's an audit rating. It is looking at this single event compared to other similar single events, but that doesn't mean all those events are created equal.
Susan Matson: [07:49 – 07:54] Let's all be completely honest: this comes after the fact, after they have achieved a successful audit.
Greg Fontaine: [07:54 – 08:16] Yeah, again, as Jake pointed out, you know, they've either been certified or maintained their certification. Whatever NCRs occurred have been closed before this is generated. So even if there's stuff that shows up on the audit, it's been resolved by the time this report comes out. And, you know, again, that's not insignificant.
Susan Matson: [08:18 – 08:39] Absolutely. Let's switch gears a little and go into the technical aspects. So what kind of data sources are used and integrated into these reports? And for someone who is receiving the reports, meaning if I'm one of the listeners that just got a report, how is my data being used?
Greg Fontaine: [08:40 – 09:27] So the OASIS Insights report is based on a data comparison of the actual audit results from all the audits that are out there. Now, they're anonymized except in your own report. So the number of NCRs that occurred in a certain audit of a certain type for a certain standard is used generically as a comparison point for a specific audit. An individual set of data is still only available to, and only in, the report generated for that organization. It's anonymized except for the actual audit report that that organization is receiving, where they get to see their own results.
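As a rough illustration of that anonymized benchmarking, here is a minimal Python sketch under the assumption that cohort audits contribute only aggregate counts, while an organization's own numbers appear only in its own report. The function and field names are invented for illustration, not taken from the actual OASIS implementation.

```python
from statistics import mean

def benchmark_ncr_count(own_audit: dict, cohort_audits: list[dict]) -> dict:
    """Compare one audit's NCR count against its anonymized cohort.

    Only aggregate counts leave the cohort records; no supplier identities do.
    """
    cohort_counts = [a["ncr_count"] for a in cohort_audits]
    own = own_audit["ncr_count"]
    below = sum(1 for c in cohort_counts if c < own)
    return {
        "own_ncr_count": own,                # visible only in your own report
        "cohort_mean": mean(cohort_counts),  # generic comparison point
        "percent_with_fewer": 100 * below / len(cohort_counts),
    }
```

Calling, say, `benchmark_ncr_count({"ncr_count": 3}, cohort)` would tell an organization how its three NCRs sit relative to the cohort, without exposing whose audits make up that cohort.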
Susan Matson: [09:29 – 09:41] Jake, walk us through one of these reports. How is this data collected and processed and presented? I have an audit, and now I have the report. What happens there?
Jake Lewin: [09:41 – 10:53] Well, what happens to get there is a system we call automagic. It all happens automagically. So really, what's happening is everything is relying on the base data set, which is all the audits that have ever happened in a given standard, and then as each audit occurs, that builds the ongoing data set. When an audit is published, the system speaks to the processing tool and it says, hey, an audit was published, you should look at it. And it picks it up and it says, OK, great. What kind of audit is this? How long was it? How does it fit in? What are its similar audits? And then it analyzes that, builds a data set, gives a score, records that for posterity, and then says, great, now I'm going to render a report and a bunch of graphics. It renders that report and sends it back to the supplier through the system. All of which, taken as a whole, is automagic.
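A minimal sketch of that publish-to-report flow might look like the following. Every function and field name is an illustrative assumption about a pipeline of this shape, not the actual system, and the scoring formula is a placeholder.

```python
# A minimal sketch of the "automagic" flow Jake describes: a published audit
# triggers classification, cohort lookup, analysis, scoring, and rendering.
# All names are illustrative assumptions, not the actual OASIS internals.

def score_against_cohort(audit: dict, cohort: list[dict]) -> float:
    # Placeholder scoring: fewer NCRs than the cohort average scores higher.
    avg = sum(a["ncr_count"] for a in cohort) / max(len(cohort), 1)
    return max(0.0, 1.0 - audit["ncr_count"] / max(avg, 1.0))

def render_report(audit: dict, cohort: list[dict], score: float) -> dict:
    # Stand-in for rendering the report and graphics sent back to the supplier.
    return {"audit": audit["id"], "cohort_size": len(cohort), "score": score}

def on_audit_published(audit: dict, all_audits: list[dict]) -> dict:
    """Triggered when the system says: an audit was published, look at it."""
    # 1. Classify: what kind of audit is this, and for which standard?
    kind, standard = audit["type"], audit["standard"]

    # 2. Cohort: find its similar audits in the base data set.
    cohort = [a for a in all_audits
              if a["type"] == kind and a["standard"] == standard]

    # 3. Analyze and score against the cohort; record for posterity.
    score = score_against_cohort(audit, cohort)

    # 4. Render the report and graphics, then send back through the system.
    return render_report(audit, cohort, score)
```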
Susan Matson: [10:54 – 11:09] Automagic, there you go. Is there a particular type of model used to analyze and summarize these reports? Because there is some real meat that comes out of it that people are finding valuable.
Jake Lewin: [11:11 – 11:55] Yeah, I think of it as two things running at the same time. The first is a series of more or less straight-ahead data science algorithms. So that's data science doing its work in terms of the comparison and benchmarking and whatnot to provide comparisons. And then there are LLMs, basic AI large language models, that are doing some of the work outside of the benchmarking, more of the analysis that produces some of the comparisons you see on the second page. So straight-ahead, almost basic statistics, and then large language models.
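Here is a minimal sketch of those two tracks side by side: plain statistics for the benchmarking, and a language model for the page-two narrative. The `call_llm` function is a local stand-in so the sketch runs; it is not a real LLM API, and the prompt wording is invented.

```python
from statistics import mean, stdev

def benchmark_track(own_ncrs: int, cohort_ncrs: list[int]) -> dict:
    """Track 1: straight-ahead data science comparison against the cohort."""
    mu = mean(cohort_ncrs)
    sigma = stdev(cohort_ncrs) if len(cohort_ncrs) > 1 else 0.0
    return {"cohort_mean": mu,
            "z_score": (own_ncrs - mu) / sigma if sigma else 0.0}

def call_llm(prompt: str) -> str:
    # Local stand-in so the sketch runs; a real system would call an LLM here.
    return f"(LLM summary of {prompt.count(chr(10))} lines of NCR text)"

def narrative_track(ncr_texts: list[str]) -> str:
    """Track 2: LLM-generated insight built from non-conformance text."""
    prompt = ("Summarize the key themes in these audit non-conformances:\n"
              + "\n".join(f"- {t}" for t in ncr_texts))
    return call_llm(prompt)
```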
Susan Matson: [11:56 – 12:14] So taking all of that information, Greg, how can an organization use these reports to identify areas where they may need to improve, or where they have opportunities for even further improvement, to take a really good system and make it even better?
Greg Fontaine: [12:14 – 14:06] Wow. I mean, first let's be clear: I think this is the first iteration of this. It's going to improve. Over time we have several further iterations planned, but the bottom line is that the real goal of OASIS Insights, this first revision of it, was to give context to an audit. Right now, you know, you get audited, and when it's all done, your boss goes, how did we do? Did we pass? Are we still certified? How many NCRs did we get? And, you know, you can answer those questions, but you don't have any idea, in the greater scope of the aerospace scheme and the IAQG standards, what that meant and how you did. So really, that context is huge, in that you can look at this and not only understand objectively how your audit of this type compared to other audits of that type at a point in time, but those further graphs that it's generating and the distribution charts will let you know whether, hey, did I get an NCR in the most common place or an uncommon place? Is this where other people get tripped up, or not so much? In addition to the actual context that it gives you and the AI-generated results that'll list some key problems that occur in typical NCR areas, you will also get linkage to IAQG resources. At the moment, that includes AIM, as I said earlier, but we're also going to include SCMH. So not only do you get a context for how your audit did, but we're giving you, specific to what occurred in your audit, resources you can use to improve. Does that make sense?
Susan Matson: [14:06 – 14:43] It does. And it really leads into the question that was already in my head about the benefits of these reports. It really does provide an enormous amount of benefit if you really dive into these reports. And as you said earlier, there's more to come. So Jake, I know we're, what, a little over a month into people and organizations receiving these reports. I don't know how much you can share, but I'm sure there are some things you have been seeing, even from the developer standpoint, about some best practices. Are there some best practices for integrating these reports into planning?
Jake Lewin: [14:44 – 16:02] Yeah, you know, I mean, I think for integrating into planning for any organization, it really boils down to receiving the report, taking a deep breath, and looking at it, going beyond just the first page and really spending some time to think about, OK, what does this mean? What does this tell us about where our opportunities might lie? Perhaps we received a citation for a non-conformance that is rather rare. Well, that's an opportunity to think about, well, what is occurring there? And really, I want to underscore what Greg said: the Aerospace Improvement Maturity Model content that is there is really quite thorough, and it has some really interesting, deep content about what the maturity levels look like for an organization in various areas of the standard. There's just a wealth of information there that an organization can use to think about how to mature, and it's targeted right to the areas where they already had an issue. So really valuable. And one of the things we really like about this is how it brings forward that IAQG tool to make it actionable for suppliers.
Susan Matson: [16:02 – 16:41] And I think you both did say this on two separate occasions, but I do want to underscore the fact that the Aerospace Improvement Maturity Model is integrated into some of the reporting, correct? So that's all right there for people to take and benefit from right off the bat. But I don't think we're stopping there, are we? So Greg, future developments, things like options to customize or tailor OASIS Insights for specific needs that maybe an organization has. Is that even something available now? Might it be coming? Could you talk about that a little?
Greg Fontaine: [16:42 – 18:53] We have a series of iterations planned, and while I don't think any of them are going to really be anything, quote, customizable or tailored specifically to the organization, we are getting a lot of feedback about a lot of ways people would like to use this. One of those could include, for organizations with a lot of sites, you know, there's been some interest in being able to see a more aggregate report of the most recent audits across all their sites or something. There's been a lot of interest at the General Assemblies and some of the meetings on ways this could be included in the larger supply chain. Those are all under discussion. The future developments we have planned are more focused on enhancing this report. This was our first take at it, and there is a lot being discussed. Right now, we count all the NCRs kind of the same, by major and minor. We know that not all NCRs are equal; some are paperwork, and some are really affecting the output and the manufacturing quality, you know, the tangible products. So while all are important, we're not going to say all are equally important. So there's going to be a lot we try to do to improve this. We want feedback on it. Everybody that's already had an audit published this year has gotten an Insights report, and everybody else whose audit is published later this year will get one. And we want that feedback about what they'd like to see and what they found useful. We've already got a lot of feedback. I can tell you this, though I'm not going to go into a lot of detail: over 85% of the respondents to the survey, and we've had hundreds and hundreds of them, thought it was useful, and over 40% thought it was very useful. So, you know, I think it's being well received, and that's only going to encourage us to keep trying to improve it and expand the ways it can be of use to everybody in the industry.
Susan Matson: [18:53 – 19:08] Thank you, definitely, for the overview. Jake, I'm going to follow up. I know there's got to be something on the horizon pretty soon in terms of specific enhancements. So are there some future releases we should be thinking about?
Jake Lewin: [19:08 – 21:30] Yeah, absolutely. Currently, we have approximately three major iterations planned and another possible report. Each of the iterations is intended to batch together a series of improvements and kind of deepenings of the report. And the kinds of things we're working on in those iterations would be: we're looking at the possibility of having an additional executive summary, and more analysis and content from AI in terms of, say, interpreting the graphs for the user in plain language, right? More of that. Over time, as additional data is collected, more analysis about the instances of a repeated non-conformance, right? That's data we're gathering that's not yet scored. Similarly, our plan is to integrate the Supply Chain Management Handbook content, just like the AIM is there. And additionally, what we're looking at is additional graphs and cross-tabulations that give a deeper dive for those people who want to, you know, kind of receive an analysis report and really dive into additional graphical data. So all of that is planned in successive iterations. And on top of that, we are currently investigating the content and the possible utility of perhaps a pre-audit report that suppliers could receive before their audit to help them prepare, maybe to identify possible areas that impact organizations like themselves. So, ultimately, we could get to a place where we have a very thorough post-audit report that is more interesting and bigger than the one we have, plus a pre-audit report; that's the kind of horizon we're going towards. And really, you know, I have to kind of salute the IAQG in their thinking about this and planning for a longer-term future, rather than a set-it-and-forget-it approach. This is very much part of a journey.
Susan Matson: [21:32 – 21:55] Absolutely. Always ever evolving, it sounds like. Not only insights, but all of the digital innovation enhancements because Insights is just one of them. And Greg, I would be remiss if I didn't ask you to talk about some of the other developments in the digital world for the IAQG that fall under that digital innovation umbrella.
Greg Fontaine: [21:55 – 24:06] Yeah, April, going into the General Assembly, was a big month for digital innovation in the IAQG. We've got three initial things that launched and were done. One hasn't quite been rolled out, but it is there and working; it's more a matter of figuring out how we're going to enable that sharing. So there'll be a future podcast on that, I'm sure. But there were three areas that we were focused on with digital innovation for 2025. Obviously, OASIS Insights was one piece. There's what we're calling IAQG Intel, which is a business analytics and intelligence platform that is now taking all of the data that the IAQG has accumulated over its life cycle, and making that data easier for us to get at, slice and dice, analyze, and make useful to the various stakeholders in the aerospace community. And then there are the IAQG data services. All of these things are going to go through iterations, just like Jake was saying. We're going to move Insights through iterations this year. The data services started off with simply a data feed and will evolve into a more API-type interactive option for people. But what we're basically trying to do is this: not only are we making it easier, through the business intelligence service, to get to the underlying data that the IAQG needs to analyze, but we're taking the data in Search and Track that a lot of the industry relies on, and through a data feed we are making that available to them if they want to programmatically consume it into their systems, versus going and searching manually on Search and Track. So we're going to better expose the data that we can to the stakeholders. We are going to use the data we can't expose to improve the scheme, and then we are providing the insights through OASIS Insights to the actual certified organizations so that they get some useful context out of the audit process.
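To make the data-feed idea concrete, here is a minimal sketch of consuming a certificate feed programmatically instead of searching manually. The URL, format, and field names are placeholder assumptions; the actual IAQG data-services feed is not specified in this conversation.

```python
import json
import urllib.request

# Placeholder feed location; the real IAQG data-services endpoint and format
# are not given in this episode.
FEED_URL = "https://example.org/iaqg/certificates.json"

def load_certificates(url: str = FEED_URL) -> list[dict]:
    """Pull the feed once so it can be queried locally or loaded into a system."""
    with urllib.request.urlopen(url) as resp:
        return json.load(resp)

def find_supplier_certs(certs: list[dict], supplier_name: str) -> list[dict]:
    """Replace a manual Search and Track lookup with a local filter."""
    return [c for c in certs if c.get("supplier") == supplier_name]
```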
Susan Matson: [24:07 – 24:41] Sounds like data and digital might be the two key words for 2025. We certainly hope so. Sounds wonderful. So, Jake, for our listeners, I know we talked about how to get some of the, you know, what's involved in the reports, how to read the reports, but there's got to be some guidance material for those that would like to not only listen to this podcast, but also read some materials before they really dive into the report that they're getting. So can you tell us where some of this information is available?
Jake Lewin: [24:41 – 25:44] Yeah, happy to. You know, both on the landing page for OASIS itself and on every report there is a link to the service desk that goes straight to a dedicated OASIS Insights area. And within that is a section-by-section breakdown defining the content, how it comes to be, and how it's calculated. We're really happy about how this came together, because one of the things it does is break down and reverse-engineer the math behind it so that users themselves can understand how a score is generated. And there are completely transparent examples of graphs and whatnot to show how a score is arrived at. I think that's really healthy, because it helps everybody kind of level the playing field so it's not a mystery. So it's there on the service desk, and it continues to be updated as the iterations occur.
Susan Matson: [25:44 – 25:57] Thank you. Gentlemen, that's pretty much all we have time for today. So thank you so much for joining me and helping our listeners get a better understanding of OASIS Insights reports.
Greg Fontaine and Jake Lewin: [25:59 – 25:59] Happy to do it. Yeah, it's a pleasure.
Susan Matson: [26:00 – 26:28] As a reminder, Jake just told us where these guidance materials are available, but they are also all in the OASIS knowledge base, and you can get to that at oasis-help.iaqg.org. Gentlemen, best of luck. It sounds like you're still rolling up your sleeves to get more things out later in the year. So thank you, and I look forward to hearing more from you soon. This is Susan Matson, and you have been listening to the IAQG Quality Horizon. Until next time, stay safe.