Software Quality Today

Unlocking the power of S.O.U.P with Jackie Davidson

January 20, 2023 Dori Gonzalez-Acevedo Season 3 Episode 1

Dori Gonzalez-Acevedo, Co-Founder & CEO of ProcellaRX, welcomes Jacqueline Davidson, Owner of Davidson QA Consulting and Head of Regulatory Intelligence and Innovation at Sware, Inc., to Software Quality Today.

Download now to listen to Jackie's perspective on S.O.U.P. (Software of Unknown Provenance): what it is and how you can leverage it within your organization.

Innovation is driven by scientists; they test the limits of their ideas as quickly as possible to see what's possible and vet out what's not. One way to do that as efficiently as possible is to use open-source technology and software rather than building their own. As an industry we need to be able to shift to using S.O.U.P. where possible in order to test our hypotheses and potentially save critical time to market as a competitive advantage. This often comes by using open-source technology, but how can we if we don't have ultimate control over that software?

S.O.U.P. may not be a term you are familiar with. Our guest today clarifies it, puts it into perspective, and explains how it can be utilized in a heavily regulated industry such as life sciences when it's fit for purpose.
 
Sware LinkedIn

Sware Website 

*Disclaimer: Podcast guest participated in the podcast as an individual subject matter expert and contributor. The views and opinions they share are not necessarily shared by their employer. Nor should any reference to specific products or services be interpreted as commercial endorsements by their current employer.

This is a production of ProcellaRX

Dori Gonzalez-Acevedo:

podcasting from Alexandria, Virginia, just a few miles from Washington, DC, where we all hope doing what is right the first time is everyone's top priority. This is Software Quality Today, presented by ProcellaRX, a podcast about the trends and challenges of software quality testing and computerized system validation, and the people who are leading the way. Hear interviews with special guests and news from customers and vendors. I'm your host, Dori Gonzalez-Acevedo, and welcome to today's episode. Welcome to another episode of Software Quality Today. I'm your host, Dori Gonzalez-Acevedo. Today I have the pleasure of interviewing Jacqueline Davidson, owner of Davidson QA Consulting and head of regulatory intelligence and innovation at Sware. Jackie is an IT quality management and compliance professional with over 25 years of experience in life sciences, spanning pharmaceuticals, biotech, medical device diagnostics, and clinical labs. Jackie enjoys helping small and large companies streamline their quality processes, especially as it pertains to IT quality assurance, software lifecycle management, quality risk management, and software validation and verification. Jackie brings a wealth of expertise in CSV and regulated system deployments. Today we explore the boundaries and the levels of validation that are required for different systems, who's responsible between vendors and sponsors, tips for how to prepare for audits, and she delves into SOUP, software of unknown provenance. This might be a new term for folks, and I'm hoping that you enjoy our conversation today, to learn more about how it's playing a bigger role within our industry. What does it mean? And how do we need to apply CSV principles for industry moving forward?
And how can companies demystify the assurance of open-source SOUP applications so that they can be inspection-ready and able to speed the pace of innovating products to serve the higher purpose of promoting product quality and patient health and safety? So without further ado, please welcome Jackie. All right, well, welcome Jackie. How are you today?

Jackie Davidson:

I'm great. Dori, how are you?

Dori Gonzalez-Acevedo:

Good. Well, thanks for joining us on Software Quality Today. I'm really excited about our conversation today, because we've had some conversations over the last year getting to know each other, and I'm really happy we get this chance to record it for real.

Jackie Davidson:

Yeah, well, thank you. I'm glad to be here. This is the first time I've ever been on a podcast, so I'm a newbie to this. But I was looking forward to chatting with you and imparting some information where I can.

Dori Gonzalez-Acevedo:

Awesome. All right, so Jackie, I'm going to ask you for a brief introduction as the head of regulatory intelligence and innovation for Sware, as well as providing consulting services. Tell us what you do and how you kind of came into the space.

Jackie Davidson:

Well, I've been in the biotech, life sciences, pharma space for... well, I just actually looked the other day, and I'm closing in on 30 years. I started out in medical device down in San Diego after grad school, and ended up moving up here in the mid-90s and working with ALZA Corporation for quite some time in IT. At the time I was doing more training and technical writing, but they were really struggling with validation because Part 11 had just come out. Nobody knew what to do. They were really confused around testing. And even at that stage, we were trying to write lifecycles and such, and come up with a sensible way to approach it as opposed to testing everything. So from the first time I saw Part 11 and saw people panicking until today, I've been trying to get people to realize that quality is about making the right thing easy. And validation is not necessarily an exercise in paperwork, but an exercise in thought and compliance, and in how you're going to come up with evidence that truly crystallizes what you tested and the application's suitability for intended use, as opposed to 1500.

Dori Gonzalez-Acevedo:

Yes, and therein lies the challenge, right? And over the course of our careers, we've seen, I'm sure, a lot of variety in how to get there, some approaches more elegant than others, I should say. And so along those lines, I know you have some passions and things that you're writing today, and things that you are really interested in sparking some interest in the community around.

Jackie Davidson:

Well, there are a couple things that I really love. I have an odd penchant for supplier audits, especially in the area of IT, because I feel like getting an understanding of your suppliers at an early stage is going to help you in the end: have a better relationship with them overall, have a better validation or software assurance outcome, and head problems off at the pass. And as part of that, very conveniently this year GAMP 5 came out with its second edition, which of course focuses on that. And then of course there's the CSA draft guidance, which finally came out in September. I know it's out for comment; I'm not sure when we're going to see that become released for real as final guidance.

Dori Gonzalez-Acevedo:

It sparks conversation. That's enough, I guess.

Jackie Davidson:

Yeah, yeah. And to that end, that's been a real source of confusion for people. To me, it feels like all the little threads are coming together: how do we assess risk? How do we assess suppliers? How do we know that we're getting the right amount of testing? How do we know we have the right deliverables? And that falls across the spectrum for CSA. So I just finished a white paper for Sware, which will be published probably after the first of the year, that discusses how CSA is a bridge to innovation for life sciences, and how we can leverage those approaches to come up with a common-sense way to show that our applications are fit for intended use and have the right deliverables. And that makes people less, you know, nervous around, "Oh my God, can I really take this approach?"

Dori Gonzalez-Acevedo:

Well, one of the things that you're saying that strikes me as interesting is that, you know, I was a chemist, and I made active pharmaceutical ingredients. So my upbringing is from manufacturing, and in the manufacturing area the relationships with our suppliers are very, very intimately tied. And I'm wondering what your thoughts are around the fact that that has been happening in the supply chain for a long time, right? Most of the auditors that you and I know professionally are from that manufacturing mindset, but haven't yet shifted or moved towards an understanding of how to do IT vendor management, or IT audits, and what that relationship is between IT and the business and the sponsor. Talk a bit about that.

Jackie Davidson:

I think you're right, that's been evolving, and that was a huge part of my job. I was with Jazz Pharma before I went to Sware, and we had a tremendous program around IT vendor auditing and IT vendor management. And then there was some struggle at points, because it was like, okay, if we're bringing in consultants, at what point is that a staff augmentation versus having to qualify and train the staff? When you're bringing in an application, who's controlling the application? Is it our internal IT department, and therefore do we need to do a different level of testing and training? Or is it the vendor that's managing it? You've got systems like Veeva that have three-times-yearly releases, and you have to be able to get those validated in time; you can't wait, because you'll be out of compliance. So at the get-go, we were finding that we really had to vet vendors early on. It was more than just, okay, we've selected this vendor, now go audit it. Because that was my experience for many years: it was like, Jackie, we signed a contract with such-and-such, go out and audit them. And I'm like, okay, you just let the horse out of the barn, because if this vendor turns out to be problematic, now what? You've already signed the contract. And that had been my experience; I had seen that happen numerous times. So we pivoted to an earlier audit. And oddly enough, the pandemic really helped, because it made it easier and quicker to do an audit, since you were doing it virtually. And really, you don't need to be on site for IT audits, because most of them are decentralized; you're not going to look at server rooms anymore. You're not looking for temperature controls.
You're not looking for the same things that you would need to when I was doing audits for labs that I worked for, where we would have to go walking through the manufacturing plant, look at the clean rooms, check pest control, check eyewashes, check spill kits, all those things. You're not needing to do that. So it was a lot easier to get more proactive with the IT audits, and also, as part of that, to really develop a relationship with the vendor. And I have a little different philosophy as an auditor. I'm definitely very risk-based, but I've always been told I'm, like, the nicest auditor they've ever had, which is a good thing and a bad thing. I mean, I'm nice because I'm nice; I'm not nice because I want something. But in the end I get what I want (and now nobody who ever listens to this podcast will ever want me), because they always give me a lot of information, because I get into conversations with them: tell me a little more about this process; can you show me these things? I ask for a lot of stuff up front to save time, because nobody in their right mind wants to be on Zoom calls for 16 hours, you know, two days of eight-hour calls in a row. So I try to be really proactive in terms of sending a comprehensive questionnaire up front, asking them to share as much information as they can and feel comfortable with up front. Because another thing, as you know, that IT auditors hate, and probably the auditee hates, is going step by step through procedures and making you read them on screen. So: being really proactive up front, using automation to every extent possible, whether it's a sharing platform where we can share our SOPs, share the data, share the objective evidence, or being able to send them stuff to fill out proactively so that I can then compare notes and add in. There's all those little things that make a difference.

Dori Gonzalez-Acevedo:

That's really helpful. Do you have any tips for sponsors to prepare better for those things with their vendors? So I guess there are sort of two sides: as a third-party auditor coming in, there's the vendor, and you're mediating that with the sponsor. So what do they both need to bring to the table to be effective?

Jackie Davidson:

I think that when you're a pharma company, when you're a sponsor, first of all, have a clear set of policies that tell you, okay, what kind of audit do I need? And by this I mean: sometimes you're auditing software that's part of a device, so you're really auditing more than just the software; then there are just simple software systems. So first of all, understand what you're auditing, and have a policy that helps to guide you, almost like a rubric or a flowchart. I have a flowchart where it's like, if it's this, I'm going to do this. So that's the first thing. The second thing: I have a set of questionnaires, questions that I have, and I map them out to the relevant regulations. So I know, if I'm doing medical device, I need to look at Part 820; I need to then look at suppliers; I need to then look at all the things under that. If it's a European organization, I might need to look at Annex 11; if it's US and European, I might be looking at Annex 11 and Part 11. So understanding the sponsor, understanding what they need to audit, what their relevant regulations are, what their concerns are, is one thing; and from there, having questionnaires, which you're going to need to customize. So I have several questionnaires in my toolkit, if you will, and I customize them depending on the engagement. For example, we did an audit of a company down in Texas that mines data out of military databases to get patient data for different types of things that were prescribed, and you can then use that for certain kinds of studies, for data comparisons of maybe your product versus another product. It wasn't a software system, and they weren't going to access any systems that were internal to the company I was auditing for, so I knew I didn't have to have all the questions about that.
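As a rough illustration, the flowchart-plus-questionnaire mapping Jackie describes might look like this in Python. The context flags and the regulation list here are my own simplified assumptions, not her actual toolkit:

```python
# Toy rubric: map an audit context to the regulations the questionnaire
# should cover. The flags and mappings are illustrative assumptions only.

def applicable_regulations(context):
    """Return a sorted list of regulations relevant to this audit."""
    regs = set()
    if context.get("medical_device"):
        regs.add("21 CFR Part 820")    # US quality system regulation for devices
    if context.get("electronic_records"):
        regs.add("21 CFR Part 11")     # US electronic records / signatures
    if context.get("eu_market"):
        regs.add("EudraLex Annex 11")  # EU computerized systems
    return sorted(regs)

# A US-and-EU device vendor handling electronic records picks up all three:
print(applicable_regulations(
    {"medical_device": True, "electronic_records": True, "eu_market": True}
))
```

A real policy would branch on many more factors (data access, hosting model, for-cause history), but the shape is the same: context in, checklist out.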

Dori Gonzalez-Acevedo:

So it sounds like you do your homework. Yeah, doing your homework ahead of time.

Jackie Davidson:

Knowing: are they going to touch your data? Are they going to control anything? Are we controlling it? Do we have control over when things get rolled out, or are we just mining data? So knowing the purpose, knowing the system, doing your homework ahead of time, yeah, is number one. Number two is knowing what questions to ask based on that, and being wise about the time. Always talk to your auditee: once you've done this homework, talk to them, figure out the agenda. Make sure the agenda is appropriate to what you're doing. Set up fair times; be cognizant that if you're auditing somebody in Bangalore, you come up with something that's fair to both of you, because nobody's on their game if they're up at midnight, and unfortunately it's always one or the other person. So there's a lot of prep work. Have somebody on your team that handles the scheduling, so you know when these audits are going to happen, whether they're scheduled or for cause. Really talk to your internal quality team: if it's for cause, understand what are the outstanding issues that have driven this audit; if it's routine, also review all that internal stuff. So really, before you go out and audit, don't audit blind. I've been on the other end of those, where they're just coming in and you can tell they know nothing about what you're doing, right?

Dori Gonzalez-Acevedo:

Yep. So that's great, thanks for sharing that. And I know we kind of swerved a little bit, but this is a hot topic as we're shifting, or wanting to shift, towards better risk-based approaches. And that risk-based approach extends through our vendors, and educating our vendors as well as the sponsors on what the shift is, and how we all have to come to the table and redesign what that is, right? In this new way.

Jackie Davidson:

It's not something that's new. This goes back to, gosh, it's got to be 2011 or 2012, I know.

Dori Gonzalez-Acevedo:

We're going to age ourselves here, go ahead.

Jackie Davidson:

I was with a company down in San Diego; it was a lab, a little genomics lab. We had a document management system that we didn't like, so we were switching, and we actually ended up switching to Veeva, and we were one of Veeva's first installations. It was, in fact, where I met the person who's now CEO of Sware; he was then the VP of Sales, and his name is Brian Ennis. Brian came to our site, and he's like, you know, what can we do to make this work? At this point I was a consultant to this little lab, and I hadn't been part of the vendor audit process, because they had done that ahead of me. So: what can you show me? What do you have? What can I leverage? If you're telling me that you're going to be releasing these updates three times a year, what can we leverage? Can I review it? So I did a secondary mini-audit to see what I could leverage, and actually did some testing against it in their sandbox. I wanted to see, because this was the first time we had done this; it was completely new to leverage vendor documentation in that way. So let me do some testing. I did an initial kind of smoke test: okay, they really did test these things; it really did work. So we created a validation plan. And that's another part: if you're going to leverage vendors, and you're going to go out and audit them, make sure you specify that in your project plans, in all your vendor audit documentation, in your validation plan. Make sure you're doing this ahead of time, that you document what you're going to do, and then you do it.
Unless you're grossly missing something, you're generally going to come out fine in an inspection. In all those years that I mentioned, I have never once in an inspection been called out on software validation, on leveraging vendor documentation, or on how we've managed change, because we've always clearly set it out. So if I'm auditing my vendor with the intention of using their documentation, I make sure that's there. But there's something else that's come to my attention recently, especially with the advent of the CSA approach, because I see that people are becoming more comfortable with being risk-based and leveraging the vendors. And there are certainly validation automation platforms out there that will help you manage that workload, especially if you're one person managing 15 of those platforms and knowing the stress of, okay, it's November 15th and I've got five of these, and how am I going to get it all done when I don't have a big team? So being able to leverage that, being able to justify what I'm doing, being able to regression test the smaller things that I might have configured: those things are really important, because I've got a known vendor. And generally speaking, the more robust your vendor is (when you come out of an audit and you know this vendor is really robust: I don't have any critical or major observations, and if I've got a minor one they're able to address it in a timely manner), the more confident I am that I can use that CSA approach, that GAMP approach, to leverage what they've got. That's great. But what happens when you are a medical device company, you are a biostatistician, you're doing something completely new, and you're leveraging software of unknown provenance, which we'll call SOUP, or off-the-shelf software, OSS, or, as FDA calls it, OTS? You don't get to look at the source code. You don't have all the documentation. You don't know who all has been working on it.
It's kind of crowdsourced. This could be something like R, which is a very well-documented piece of SOUP.
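The leverage decision described above (a clean audit lets you lean on vendor documentation and regression-test only your own configuration) could be sketched like this. The rules and return values are invented for illustration, not from any guidance:

```python
# Illustrative only: choose a validation strategy from vendor-audit findings.

def validation_strategy(critical, major, minor, configured_items):
    """Pick a testing approach based on audit observation counts (invented rules)."""
    if critical or major:
        # A weak vendor means you cannot lean on their testing.
        return {"leverage_vendor_docs": False,
                "testing": "full requirements-based testing in house"}
    # Robust vendor (at most minor findings, addressed in a timely manner):
    # leverage their documentation and regression-test your own configuration.
    return {"leverage_vendor_docs": True,
            "testing": f"regression test {len(configured_items)} configured items"}

print(validation_strategy(0, 0, 1, ["approval workflow", "custom report"]))
```

The point is not the thresholds but that the audit outcome is documented input to the validation plan, decided before testing starts.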

Dori Gonzalez-Acevedo:

So can I slow you down for a moment? Let's circle back and educate our listeners, because these are new areas for some, and I think it's worthwhile taking a pause and going through these use cases, Jackie, that you're going to identify here. So, software of unknown provenance, or SOUP software: how would you define that?

Jackie Davidson:

A lot of the time it can be something that you're downloading from the internet; that's at the simplest level, something you're downloading from the internet, like a PDF converter. But it could be something bigger. I mean, R is very well documented as far as statistical processing programs go. But what people don't realize, in Big Pharma and a lot of companies... so maybe

Dori Gonzalez-Acevedo:

this is so important, because there are a lot of data scientists out there, and there's lots of functionality being developed in small buckets in an open-source sort of way. Let's call them widgets, for lack of a better word. They're widgets that you can go and grab from the internet, through a university or a small company or whoever is developing them, and you can utilize that in your process, right? And so that's what we're talking about: this is kind of like research and university stuff on steroids, right? Pulling that native technology in and actually integrating it into a potential medical device.

Jackie Davidson:

Right. But what makes a lot of companies squidgy about this, and rightfully so, is that you don't know the pedigree. It's kind of like going to the dog pound (and I love dogs) and getting that puppy. It looks like a Labrador; you bring it home, but three weeks later it sprouts ears and sprouts a furry tail, and you've got a collie. And you still love it just as much, but you don't know its pedigree. This is the same thing. Sometimes you get one of these widgets: you don't know what's in the code, and you don't have any control over its lifecycle. If they update it or they patch it, they're not going to let you know. So especially if you're using one, maybe something you're downloading that's like a subscription, it could change on you unexpectedly. At the same time, I'd say that SOUP and open source get a bad rap. Because I'd like to point out, too, that at this very moment you and I are using SOUP, and you don't even know it. If you're on the internet, you're using TCP/IP, which was invented in the 1970s as part of research on packet radio. It morphed into the internet with a very small user base. And I pretty much guarantee you that no quality manager that I've ever met, including myself... have you ever thought, Dori, about validating TCP/IP?

Dori Gonzalez-Acevedo:

No way. Not at all.

Jackie Davidson:

Nope. So some of this stuff is so ingrained that over time it's validated by use, right? To the same extent, people come to me like, Jackie, I don't know what to do with this. They would take one of these widgets and try to pick it apart and validate it. And back in the day, I remember literally sitting down with these old guys from IBM that were our validation team at ALZA, and they'd be like, well, first we have to turn the machine on and make sure it boots up. I'm like, well, that's an IQ. Let me point out that if we can't get the machine to boot, then it's an automatic fail, so we can't use it. So let's think about getting past that IQ, especially because nowadays there is no IQ; it's more like an OQ, or performance qualification, or UAT. So the first thing that I tell people, if you're going to use any of this open-source software, is to understand what kind of open-source software you're using, and understand the definition. Because in the eyes of the FDA, anything where you don't control the lifecycle is kind of off-the-shelf, open-source. You can't look at the source code of something like MasterControl. So to an extent, off-the-shelf is a form of OSS or SOUP, even though you're buying it from a vendor. And there's where your vendor audit comes in: go back to that vendor. Right? Yeah, I know who I'm buying it from; they're well established.

Dori Gonzalez-Acevedo:

So this then also implies why fit for purpose, or intended use, is so critically important, right? Because if those widgets are being used in a critical function or a critical calculation, therein lies the rub, right? If they're being used behind the scenes, for all intents and purposes to facilitate operations and getting things moving, that's not such a big deal. But if they're being used to make some critical decision points, right? So let's talk about that.

Jackie Davidson:

Yeah. And as part of that, I want to point out there's one flavor of SOUP that I didn't talk about that I probably should. That's when a device manufacturer internally develops a software component as part of a device, but they haven't documented or followed a software lifecycle process. I've encountered this working with a couple of university offshoots, where they started out largely as a research organization and they've developed this device, and then they're like, okay... and I was brought in to help them kind of backfill the design history.

Dori Gonzalez-Acevedo:

I've seen this.

Jackie Davidson:

Oh, we've got this whole device. I walk into this lab, and there's this giant machine, literally. Okay, well, maybe we should start out with: what is this for? Let's write the user requirements, then trace back and get the technical specifications and develop that design history. Yeah. So I would say, let's just make sure we break out the kinds of SOUP, because not all SOUP is proprietary purchased software, and not all of it is

Dori Gonzalez-Acevedo:

just software. So yeah, I laugh, because I've had that same experience auditing a university, and they're brilliant, brilliant scientists, right? They can find things that you had no idea even existed before, and they're utilizing them in very great ways. They have the big-picture vision of what this thing is going to do in the future, but they have zero documentation on how they got to where they are. And for everyone that's listening: you have to have that design history file for the FDA to get clinical trial use. If you don't have some of that information, or at least you can't source it, or you can't figure out where it came from, that's a big deal, depending on what the experiment or the device is that you're going to submit for.

Jackie Davidson:

Yeah. So my little nugget for any professors who are listening out there, because I've had professors ask me straight out, why do we need quality? And I'm like: because if you ever want to sell what you're inventing right now, if you want to bring it to market, you're going to have to go through trials. So document along the way. You know, just write it down.

Dori Gonzalez-Acevedo:

Yeah, it could be a lab notebook, all that sort of stuff. It does not have to be fancy at all; we don't care. It's just that it would be good to know. And again, being scientists, they do do that. It's just... I know that me translating my lab notebook into process validation documentation was an effort that I don't want anyone else to have to go through. So yeah.

Jackie Davidson:

Let me just back up on our definitions. I just wanted to give this to help people, because this is new to a lot of people. So you've got that SOUP that we just talked about that's out on the internet. You're using it; it's part of your life. It's like air: nobody is validating air. TCP/IP is the same thing. Then there's SOUP that's not off-the-shelf software; that's something that we have developed. You know, Dori and I are inventing something, and we wrote all that software but didn't write anything down. Then there's off-the-shelf software that isn't SOUP, and that can be two kinds. It's usually, as FDA says, generally purchased, generally available software that you're using as a device manufacturer but that isn't part of the device. That can be a piece of software controlling a piece of lab equipment, like a spectrometer. It isn't included in your device, but you need to document somewhere that you're using it and that you validated it, so that your device is calibrated and you know you're getting the same results every time. That's something that people don't always think about. Then there's off-the-shelf software that is SOUP: stuff that you might generally purchase or download, like those widgets we were talking about, but that you're incorporating into the device, like embedding a software library or system in a device. And then, of course, there are those widgets or programs that get invented based on SOUP. Lastly, there's freeware, and we'll kind of leave that out, because that's a whole other can of worms on its own; those are use-at-your-own-risk. So if you're downloading a program from somewhere, and this goes for anybody, anywhere in the world, doing anything: if you're downloading a free program, you want to be really careful, you want to document it, and you use it at your own risk.
And if it's going into any kind of regulated product, you want to test the heck out of it. That's where risk-based comes in: my risk has just skyrocketed, because I don't know where it's coming from. It's like somebody left a puppy on my doorstep. I don't know where it's from; it could be vicious, could be great. Gotta check it out. I don't know the pedigree. The less you know of the pedigree, the more homework you have to do.
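The categories walked through above can be summarized as a small taxonomy. The labels below are editorial shorthand for the descriptions in the conversation, not official FDA terms:

```python
from enum import Enum

class SoupCategory(Enum):
    # Labels are editorial shorthand for the categories described above.
    UBIQUITOUS = "infrastructure used implicitly (e.g. TCP/IP); validated by use"
    UNDOCUMENTED_INHOUSE = "software you developed without a documented lifecycle"
    OTS_SUPPORTING = "purchased software supporting the device but not part of it"
    OTS_EMBEDDED = "purchased or downloaded software incorporated into the device"
    BUILT_ON_SOUP = "programs you invent on top of SOUP components"
    FREEWARE = "free downloads: document, and use at your own risk"

# Print the taxonomy as a quick reference.
for category in SoupCategory:
    print(f"{category.name}: {category.value}")
```

Sorting a system into one of these buckets up front drives how much homework (and testing) follows.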

Dori Gonzalez-Acevedo:

So you're seeing this as a big trend, because there's also more of the Internet of Things coming into software as a medical device, right? We're going to be seeing more and more of this as we move forward, right?

Jackie Davidson:

Yeah. I was at a conference recently where this came up, where they were talking about the new edition of GAMP, and people were asking, well, what do I do with these systems? My answer to the person asking the question (I wasn't a speaker at the conference, I was just there as a participant) was: you really have to look at the level of risk. And I know that out there somewhere I can hear people going, it's not just about risk. But risk is a big component. There's also that level of concern; there's what happens after we assess. So I thought maybe I'd take a few minutes to step you through one of those "what would Jackie do?" exercises.

Dori Gonzalez-Acevedo:

Sure, that's awesome.

Jackie Davidson:

So look at the FDA's guidance on the use of off-the-shelf software that came out in 2019. They're talking about using it in devices. But when I think about it, it's not just devices, it's companies. Any pharmaceutical company using big data, bioinformatics, data analytics: the software industry just can't move fast enough to pick up and cater to your specialized needs, so quite often you're building your own. So the first thing to do is to assess and do a risk hazard analysis, like you were saying. Is this going to critically impact patient or product safety? When I mitigate these risks, does it bring that level of risk down to an acceptable level of concern? What am I doing with my residual risks? Am I performing a risk review? So if I had to take my first step, it would be: I'm going to do this risk hazard analysis, and I'm going to come up with a mitigation plan.

Dori Gonzalez-Acevedo:

But you're doing that at the system level of the SOUP, right?

Jackie Davidson:

I'm doing it for the SOUP and for the use, like, how am I going

Dori Gonzalez-Acevedo:

To use it, right? But still at the high level. I just want to differentiate, because when we start saying "risk-based approach," folks traditionally hear that as FMEAs, which is not what we're talking about here. That is really specific to medical device design considerations. We're elevating this conversation up higher, to the use, and to what is going to be impacted as a whole.

Jackie Davidson:

So what are you going to automate? What's this thing here for? Why are you even putting in this piece of software? Why are you using it? Why are you building it? Is it directly impacting patient or product safety? Or is it indirect, like I'm doing data analytics to figure out where else I can sell this, something like that. Is it more of a commercial use? Is it more of an R&D use? Is it a definite safety use? Is it going to be part of a submission to the FDA? Then my risk just went up, because I'd better get it right. Am I going to be using this in a decision to keep a product on the market or pull it off? Is it safety related? Is it part of a device? Then what's the safety classification of the device: Class I, II, or III? What level of risk is the device itself? The higher the risk classification of the device, and of the concomitant software you're putting in it, the higher the overall risk of the project. So start right there, document that, and then go down to your system-level assessment.

Dori Gonzalez-Acevedo:

Yeah. So one of the other things I'm hearing in what you're saying is also part of the trends I see in the industry as a big-picture whole, right? We're more complex than ever before. We're pulling together pieces and bits of stuff from all over, not just the proprietary information that a given sponsor is making; we're also leveraging other parts of good technology, AI and machine learning as well, right? All of these things put together, the ecosystem of what we're building today, is more interdependent on all of us, rather than on a single company. And so the risk from a business perspective also jumps up, right? Whereas a decade ago, a lot of stuff was all done in house. And now, in order to innovate, to get to the next level, we actually have to innovate through partnerships across a variety of things. So our due diligence goes up. And in some ways, I don't want to get into legalese; putting the legal stuff aside, it's very exciting, because what you can create is infinite. But you need to understand it. I like the word provenance, because it really ties into all of this: really understanding all of the pieces and doing like a heat map of where your risks are within that whole ecosystem. Yeah,

Jackie Davidson:

yeah. And it's not just with software. In fact, I just read an article recently, and I'll have to pull it and send you the reference. It was about literally open-source pharmaceuticals, where companies are cooperating with each other to develop something new. It's like, you've got a piece of the puzzle, I've got

Dori Gonzalez-Acevedo:

a piece of the puzzle, right? And how can we partner together? And I think that partnership is also a common theme in how to do all of this moving forward, because all of us have expertise in whatever thing we make or service we provide, right?

Jackie Davidson:

And part of this too, going back to the software piece: as you're performing that criticality assessment, because you're pulling these disparate pieces together, you need to be sure to look at the downstream impact on other systems. Is this SOUP or open source going to be part of an entirely new system? Are you improving an existing system? What impact is this going to have on product safety, quality, availability, like supply chain? Is this going to affect your supply chain in some way? This probably has more to do with software as a medical device, which is a whole other topic. But basically, if you have an open-source component that enables you to bring this device to the supply chain, to the virtual supply chain, where otherwise you couldn't, then it completely affects that supply chain and may fill a need for a life-saving drug or device. Now, going back to my favorite thing: like I said, I'm an audit and process geek. I love writing SOPs, I love writing procedures, and I love auditing. But what if I can't? Who's going to go out and audit them? You can't audit Amazon. You can't audit Atlassian. Those companies are huge. Also universities and things like that; they're not going to have the ability to let you do that. So it's a similar approach where you have to pull the information that you can. Amazon and Atlassian do an amazing job of giving you so much information; I can go to their websites and get all the information, understand the intended use and everything. But if I've got a widget that was built by a bunch of super-smart scientists at some university, I probably can't audit it. So then I've got this thing. After I've assessed the risks of this thing, after I've come up with the level of concern and the mitigations, I've got to figure out how I'm going to test it.
And there are a few things that go into that. Availability of any documentation: is there any design documentation, anything I can leverage? Do they have release notes? Are there any tests that I can leverage? Did they publish anything? Do I get notified of releases and patches? Maybe I have to talk to my IT department about how we can handle this. Can we sequester the thing in a sandbox during upgrades? How am I going to make sure I have appropriate test environments for it? And then reviewing along the way as I mitigate each of the risks, doing risk review, which is something people don't do a lot of. If I can't audit it, I need to do a really comprehensive risk assessment and testing, and I need to review risk to make sure it's really been mitigated by whatever measures I take and whatever testing I do, to make sure that not only is the system fit for intended use, but it isn't going to break anything else in my ecosystem.
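The documentation-availability questions Jackie walks through could be captured as a simple checklist that feeds the overall assessment. This is a minimal sketch, not a standard; the checklist items and the idea of counting gaps are illustrative assumptions drawn from the questions above.

```python
# Hypothetical sketch of a SOUP supplier-assessment checklist, per the
# questions above. Each unmet item is a gap the sponsor compensates for
# with additional in-house testing and risk review.

CHECKLIST = [
    "design documentation available",
    "release notes published",
    "supplier tests can be leveraged",
    "notification of releases/patches",
    "can sandbox during upgrades",
    "dedicated test environment exists",
]

def assessment_gaps(answers):
    """answers maps checklist item -> bool; returns the unmet items."""
    return [item for item in CHECKLIST if not answers.get(item, False)]

# Example: a university-built widget with sparse documentation.
answers = {
    "design documentation available": False,
    "release notes published": True,
    "supplier tests can be leveraged": False,
    "notification of releases/patches": True,
    "can sandbox during upgrades": True,
    "dedicated test environment exists": True,
}

gaps = assessment_gaps(answers)
print(f"{len(gaps)} gaps to compensate with additional testing: {gaps}")
```

The point of structuring it this way is the one Jackie makes next: if you cannot audit the supplier, the number of gaps tells you how much more homework the testing side has to carry.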

Dori Gonzalez-Acevedo:

Yeah, it's exciting. I think a lot of this goes to how to continue to be creative in this space, right? Because here's the other thing: we can't develop everything. And I know you and I have talked in the past about the trust factor, right? We need to be able to partner and be creative, and figure out how to trust those relationships enough to be able to advance our own ideas within the companies that we work with. And the combination of software as a medical device with a drug, with machine learning, all of that is happening, right? There are lots of startups working on some really cool stuff. And if we as an industry can't shift and make use of that, that is just concerning to me. Like, what are we going to do about that?

Jackie Davidson:

Well, I think some of it goes back, when you're thinking of using this stuff, to that CSA approach, the thing that concerns people about CSA and GAMP: how do I know what I'm doing is enough? We all have inspection fear. Even if you're an inspector, you have inspection fear. So

Dori Gonzalez-Acevedo:

I was just talking to a customer this morning about this. And again, CSV, CSA, I don't care what you call it; it's just another term, in my opinion. What we're really talking about is software quality testing and good software quality practices. And if we bring it back to that, many industries, not just the life sciences and healthcare industries, do that very, very well, regardless of the words that we as a regulated industry put on top of it. So how can we leverage those best practices being done in the world today, across the board, really cool OT and IT sorts of stuff? And there are much more important things that we need to talk about, maybe not today on this podcast; maybe we'll come back and talk about them another time. But cybersecurity, yeah, that's much, much more important than some of the nuts and bolts of what we're talking about from a validation testing perspective.

Jackie Davidson:

And we haven't even touched on that. But of course, if you've got a well-known application from a well-known vendor, and they're using, say, Amazon data centers or something very well established, your cybersecurity risk would consequently be lower than if I'm getting an application from I-don't-know-where to create my tool, or maybe I'm grabbing something that was built in Python or R and I don't know who built it. I don't know if they put trapdoors in it; you don't know what you're getting. So again, have a cybersecurity expert in, talk to your IT department, if cybersecurity isn't your primary wheelhouse. I'm definitely a compliance and risk geek, and what I try to do in these cases is have quantifiable assessments. Going back to that build-it-yourself thing back in the 90s (I'll age myself): we built everything, because what we needed wasn't available. We had a huge team of software developers, because we had to build everything we needed, like our manufacturing resource planning software, everything. And consequently we had to test it, but we knew where it came from; we could look at the source code. Now I can buy one, and I may know what impact it's going to have, but I can't look at the source code, so I have to take that into consideration. Same thing with risk meetings. Back then I remember sitting in these very warm rooms (it seems like it always took place in a very warm room right after lunch, which is a deadly combination),
everybody wanting a coffee and a cookie, and you'd be sitting there having people argue over these risks. And I was like, okay, why don't we have ten questions we ask about each and every one of these things? You're going to break it down however you want, but ten or fifteen standard questions: Is it going to get used in a submission? Does it directly impact patient safety, product quality? Are you going to need to do a recall? What are the things that are high and low risk? And score them. Then I've got this score sheet. So when I get audited, I can say, this feature was high impact, high risk, so we're going to test it more; this feature was the color of the interface, it has no impact on product safety or product quality, so we don't really care, we can test it less. And then I multiply that out. I actually use math, where I multiply my device classification (three to one), my risk classification, my level of concern for my residual risk, and what I know about the vendor, and I come out with the amount that I need to test each thing. For myself, I've created a series of tables where I look at the intended use, the risk hazard analysis, level of concern, mitigation plans, risk reviews. Then I look at my different activities: if I have a major level of concern with a high-risk piece of software from a vendor that I don't know, I'm going to require everything, like a full audit of whatever I can get, a full validation plan, full test plan, hazard assessment, user requirements and design documents if I can get them, a lot more testing, a lot more verification that things work. Whereas if it's low risk, like it's running a piece of equipment that's not even impacting my drug or device,
I can do less testing. Maybe not no documentation, but I can do more ad hoc testing, I can have a lighter-weight test plan, maybe use more unscripted testing for the lower-risk features, the lower-risk SOUP. But for my critical stuff, for a Class III, major level of concern, I'm doing a lot of scripted testing. I'm looking at any changes that come through, and I'm applying more testing to those, because I don't know what downstream impact the software changes that developer released to us will have. So I don't know if that makes sense; I'm trying to encapsulate it without an illustration.

Dori Gonzalez-Acevedo:

It makes sense to me, because I live in those matrices a lot, and I know the practicalities: some of what you're talking about does hit an operational bottleneck. So my cautionary tale, from an operational perspective, is that we all need to be more familiar and more comfortable living in the gray, rather than the black-or-white, this-or-that sort of bucket. Because we don't have infinite resources, time, and space; to do everything is not possible. So we identify the risks, and at least have those documented where a business decision has been taken and made, so that you can follow up on that and see, a year from now, was your risk decision appropriate? Right. Because I think it goes back to really proving effectiveness, and CAPAs are the same sort of thing. If we're not reviewing our risk evaluation operationally, in live production use, after feedback, we're not really learning and we're not evolving, from a product perspective and a risk management perspective. And that is required, and should be part of the process as well. That's the part I see is often forgotten. I see it forgotten with CAPAs, I see it forgotten with risk, I see it forgotten with audit management and vendor management. We're really good at following the process of doing something the first time,

Jackie Davidson:

right, like one and done.

Dori Gonzalez-Acevedo:

What we're not good at, and I'm making some broad statements here, is the follow-up and the follow-through, and the benchmarking of what we are today versus what we were 12 months ago, and then being able to re-baseline on that. I think part of the resistance to that is, well, we have so much more to do, right? The things that are coming in are not slowing down, they're accelerating. And so having a system like Res_Q to track the audit findings and the CAPA findings, and have periodic ways to remind me that this stuff is going on, in a way that adds value, I think is critical.

Jackie Davidson:

Yes. And risk review. One of my little pet peeves is that people say, oh, I did my risk assessment of the project, and then, oh, we're going to have a post-mortem at the end of the project, but that never happens, so you never learn from it. The other thing, going back to what people do with validation records, and this is across the board, doesn't matter if it's CSA, CSV, SOUP, not SOUP: people create these things as static records, and then they just get shoved in document control and you can never really reuse them. You can't really access them again, and that's really hard in an inspection, because you get three or four weeks ahead and they'll give you 14 folders: hey, Jackie, for each of those applications that you QA, can you give me the records for all the changes? Well, that's where a system like Res_Q, a validation automation platform, will really give you a lot of bang for your buck. First of all, everything's there, so I don't have to go searching, which saves Auntie Jackie a lot of time. Second, things are reusable, and your records are no longer dead and static. I come along, and now I've got my release R2.2.2 or whatever, and I can look back: here's what happened in R1, here are the changes in R2, this is where I need to look at downstream impact. Is there any regression? Can I review any risks that came up? So my risk review is a living risk review. And I'm a distance runner, and there's a saying that if you wait until you're thirsty to drink, it's already too late. Yeah. If you wait to evaluate risk until things start breaking, it's going to cost you so much more time, effort, compliance risk, inspection risk.

Dori Gonzalez-Acevedo:

Right, because you

Jackie Davidson:

waited too long; you didn't do your homework. So going back to the whole thing: your vendor audit, whether it's SOUP (if you can get one) or OSS, your risk review, these are all steps in doing your homework. And to be really honest, you're not reinventing the wheel every time. If you have a validation automation platform that automates a lot of these workflows for you, like Res_Q, I'm not having to re-think it every time. I can have all my questionnaires in there and have that as part of my validation package, so that when I'm going to audit Dori, I've got a set of questions for her and I can make sure all my Ps and Qs are done, that I'm not forgetting anything. And I shouldn't say Ps and Qs; it's not nitpicky. I've got a lot of that thinking outlined, so I can really concentrate on what's different. I'm not doing the check-the-box activity every time; I'm doing the critical thinking. What's this for? Why am I using it? What's the intended use? Who does it impact? What workflows does it impact, beyond the obvious? And this is the other thing: beyond my obvious user group, who else is it impacting? And I'll give you a little example. I was working with a vendor, a pharmacy for a highly controlled substance. And the pharmacy, unbeknownst to the company I was working for, had run out of eight-digit numbers for their serial numbers, so they'd gone up to nine-digit numbers. But in doing so, they did not update their own software. Consequently, they didn't let us know that we needed to update our own software. And I was doing a forensic audit to see what lots came back, and all of a sudden I wasn't seeing lot numbers for several months. I'm like, this is weird; I know we released drugs.
And when we dug into it, it came out that it was the vendor that hadn't done their part, which told us: we thought we were doing everything right the whole time. We thought we were staying up to date in our software. But because another totally disparate party didn't know, there wasn't that downstream impact assessment. Sometimes you have to think out of the box, because I was like, why is this broken? We can accommodate these numbers; we can accommodate 20 digits. It was because the vendor software could only accommodate eight, and they'd gone to nine. Yeah.
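The failure mode in Jackie's story, a fixed-width serial-number rule that silently stops matching when the upstream party adds a digit, can be sketched in a few lines. The field widths and function names here are hypothetical; this just illustrates why lot numbers can disappear with no error raised anywhere.

```python
import re

# Hypothetical sketch of the serial-number failure described above: a
# downstream system accepts only eight-digit serials, so when the pharmacy
# moves to nine digits, the new lots are silently dropped from
# reconciliation -- nothing errors, the records just go missing.

EIGHT_DIGIT = re.compile(r"^\d{8}$")     # the original, too-rigid rule
FLEXIBLE = re.compile(r"^\d{8,12}$")     # tolerant of announced growth

def reconcile(serials, pattern):
    """Return only the serials the downstream system actually records."""
    return [s for s in serials if pattern.match(s)]

incoming = ["00123456", "00123457", "100123458"]  # vendor moved to 9 digits

print(reconcile(incoming, EIGHT_DIGIT))  # nine-digit lot silently dropped
print(reconcile(incoming, FLEXIBLE))     # all lots recorded
```

This is exactly the downstream impact assessment gap Jackie calls out: neither rule is "wrong" in isolation; the break only shows up when the two parties' assumptions about the field are compared.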

Dori Gonzalez-Acevedo:

Now, that interconnectedness. And it's not necessarily malicious by any means; it's just that there are disconnects, of not knowing how everyone's connected, upstream and downstream. Yeah. Well, Jackie, it has been a pleasure. We're going to have to do this again, because I think we can talk about many other topics together for quite some time. It was a pleasure to have you today on Software Quality Today, and stay tuned for next time.

Jackie Davidson:

And I'm looking forward to seeing you on the Women in Validation call this afternoon. Yes,

Dori Gonzalez-Acevedo:

absolutely. Thank you. Take care. Thanks for listening to Software Quality Today. If you like what you just heard, we hope you pass along our web address, procellarx.co, to your friends and colleagues. And please leave us a positive review on iTunes. Be sure to check out our previous podcasts and check us out on LinkedIn at ProcellaRX. Join us next time for another edition of Software Quality Today.