Software Quality Today

A Critical Look at Computer Software Assurance (CSA): Is it Really a Thing?

July 01, 2022 Dori Gonzalez-Acevedo Season 2 Episode 3

A risk-based approach to Computerized System Validation has been around for over 20 years, so what's the hype about CSA? Join us in this episode as we explore this question with our amazing guest, Dr. Bob McDowall.

Bob McDowall is an analytical chemist with 50 years of experience including 15 years working in the pharmaceutical industry and 29 years as a consultant to the industry and suppliers to the industry. He has been involved with the validation of computerized systems for over 35 years and is the author of the second edition of a book on the validation of chromatography data systems published in 2017 and a book on data integrity for regulated laboratories in 2019.

Bob is the writer of the Questions of Quality (LC-GC Europe) and Focus on Quality (Spectroscopy) columns.  One such column in September 2021 was entitled Does CSA Mean Complete Stupidity Assured? It is available online at:
 https://www.spectroscopyonline.com/view/does-csa-mean-complete-stupidity-assured-

*Disclaimer: Podcast guest participated in the podcast as an individual subject matter expert and contributor. The views and opinions they share are not necessarily shared by their employer. Nor should any reference to specific products or services be interpreted as commercial endorsements by their current employer.

This is a production of ProcellaRX

Dori Gonzalez-Acevedo:

Podcasting from Alexandria, Virginia, just a few miles from Washington, DC, where we all hope doing what is right the first time is everyone's top priority. This is Software Quality Today, presented by ProcellaRX, a podcast about the trends and challenges of software quality testing and computerized system validation, and the people who are leading the way. Hear interviews with special guests and news from customers and vendors. I'm your host, Dori Gonzalez-Acevedo, and welcome to today's episode. Today I'll be bringing you a lively conversation that I had with Bob McDowall. Bob is an analytical chemist with over 50 years of experience in pharmaceuticals as well as consulting. He has been involved with the validation of computerized systems for over 35 of those years. He is the author of a book on validation of chromatography data systems, published in 2017, and a book entitled Data Integrity and Data Governance: Practical Implementation for Regulated Laboratories, published in 2019. Bob is the writer and editor of Questions of Quality in LC-GC Europe and Focus on Quality in Spectroscopy. One of the columns that he wrote in September 2021 was entitled Does CSA Mean Complete Stupidity Assured? I'll have the link in the show notes available to you. So without further ado, please welcome Bob McDowall. Well, welcome, Bob. It's a pleasure to have you join Software Quality Today. I'm really excited to be able to have this chat with you.

Bob McDowall:

Thanks very much, Dori, for the invitation. It's a nice opportunity to have a good discussion about computer system validation and computer software assurance.

Dori Gonzalez-Acevedo:

Yeah. So I think hopefully our listeners will really take to some of the topics we're going to talk about today. I think you and I both can heat this up a little bit. Before we get into it, because it's a very, very rich conversation for both of us, I'd love for you to give us some frame of reference of who you are in the industry and how you came to be in the place that you are today.

Bob McDowall:

Okay. Essentially, I started life as a forensic toxicologist. So I analyzed dead bodies for a living, and I have a PhD in death, which makes me the least likely person you want to sit next to over a meal. I then saw the light, and the additional funding available in the pharmaceutical industry, where I worked for 15 years: ten of which were at SmithKline, when it was French, and five years at Wellcome. And then for the past 29 years, I've been a self-employed consultant. But really, I'm a frustrated academic; academia pays peanuts, and my wife would prefer a better standard of living. But why did I come here? I was in the wrong place at the wrong time. If I go back to the 1980s, I went into my boss's office, and he said, you've got a few minutes, you've got a bit of spare time? And I made the mistake of saying yes. So I ended up with a laboratory information management system project to run, which I enjoyed, and I edited the first book on LIMS: the things that we did and, more importantly, should have done but didn't. And I started getting into computer validation when the software services manager came up with a great big grin on his face and said, well, Bob, isn't it nice to know that if I screw up your LIMS project, you go to jail? And I thought, hang on a sec, this isn't right. And then I found out it was. So in October 1986 I attended a three-day training course in Amsterdam on computer validation, and I've been involved in computer validation ever since. It varies from the sublime to the ridiculous: taking screenshots of just about everything under the sun, to trying to be a lot slicker, learning from your last project and trying to improve on that. So that's how I'm sitting here today.

Dori Gonzalez-Acevedo:

Great, thanks. You know, being in this industry for such a long period of time, we've seen a lot of guidances and documents from a variety of agencies. And like you said, you're a frustrated academic, but you are a prolific author, and you write, and you're also part of ISPE. So can you tell us a little bit about all of your involvements over the years?

Bob McDowall:

Okay, well, I suppose since I did my PhD studies I've been writing articles; I was allowed, both at SmithKline and Wellcome, to keep publishing and presenting. And as well as the LIMS book, I've written two books on validation of chromatography data systems, and also another one on data integrity and data governance for the regulated lab. With ISPE, I joined probably in the mid-1990s, if only to save money on buying the publications, but then I've been involved with the Good Practice Guide for IT Infrastructure, the second edition of the lab systems guide, and then over the past few years input and review into the Records and Data Integrity Guide, plus also Data Integrity by Design and data integrity key concepts. So that's my input from there.

Dori Gonzalez-Acevedo:

Yeah. And so, you know, ISPE for me is always the go-to, and one of the things that has been so helpful over the years is being able to go back to a standard, and to industry colleagues who are dissecting it and putting out, in layman's terms, things that can be done in practice. And so I appreciate you and the others who have done that for everybody.

Bob McDowall:

I think I'd rather correct you on that, if I may. It's a guide, it's not a standard. As even the editor of the GAMP Guide says, it's subject to further interpretation, and on occasion I will do that. For example, I modified a GAMP category lifecycle in the GAMP 5 guidance and condensed it down for some of the laboratory systems that I work with, in terms of validation, because you can streamline things a little bit more. So it's guidance rather than a standard. Having said that, it's a great guide; there are a lot of people involved, they're all volunteers, and they also get involved with regulators as well.

Dori Gonzalez-Acevedo:

Right, yeah. Thanks for clarifying that point. And it really is at the heart of, I think, the nuanced conversation between CSA and CSV, which we'll talk about, right? They're all guidances. Even the FDA puts out guidances. And it's how we interpret them, and what we do with them as an industry, that really matters, right?

Bob McDowall:

Yeah. I think the one thing that is interesting is if you compare the regulations on one hand and the guidances on the other, both contain the word "should". However, in the more recent FDA guidances, "should" is defined as something that is recommended: you can deviate, but you need a good justification. And of course, on every single page it states "contains non-binding recommendations". So it's one of those things.

Dori Gonzalez-Acevedo:

Yeah, great. So I want to talk today about the article that you published in September 2021. The title, which I loved and thought was very provocative and needed to be talked about, was Does CSA Mean Complete Stupidity Assured? So tell us a little bit about how you came up with that title.

Bob McDowall:

I think the genesis was, if you go back in time to the FDA's Case for Quality, they've been talking about CSV being a bottleneck. Now, most of my experience is in pharmaceuticals and a little in medical devices. I have been involved in 21 CFR 820 and ISO 13485, and also IEC 62304 work, in the past, but it is not the main part, so I would preface my comments with that. The work that was coming out of that seemed to suggest we use critical thinking, we use trusted suppliers, and everything else. And if you look in the publications from the FDA in terms of what guidances they're going to be issuing, it's come up, I guess, over the last three or four years that there's going to be a guidance on computer software assurance, which is touted to be a complete replacement for computerized system validation. Which, okay, we move on, things evolve, and I don't have a problem with that. But then you start seeing a lot of publications and presentations from people, and for a regulated industry they say, essentially, don't wait for the guidance from the FDA, just do it. And that concerns me a lot. We are a regulated industry, heavily regulated. And because of that, we really want the FDA guidance, even if it's a draft guidance. Of course, some of the draft guidances are updated probably a little slower than glaciers recede; having said that, they can get their act together. I mean, the Part 11 Scope and Application guidance was draft and final within a year, actually less than that, probably eight months. But for the most part, where's the guidance? Because that is the definitive baseline from which we work. And you get people talking about this and publishing on it, which is fair enough. But when they say ignore the guidance, don't wait for it, just do it, I have a major concern, because it's being filtered by those people; you cannot see the definitive guidance from which you can draw your own conclusions. It comes from the FDA, through a person, via presentation or publication, and is filtered down to you. Now, I'm not saying these people are malicious or have ulterior motives. But based on my experience, I want to read that guidance and I want to interpret it myself. So I became increasingly concerned, because the FDA, in my view, have actually abrogated their responsibility to get the guidance out. I wanted to write about this, because from where I'm coming from, I don't think you need CSA. I write two columns, both for analytical magazines: the Questions of Quality column, which I've been writing for 29 years now in LC-GC Europe, and my Focus on Quality column in Spectroscopy magazine, which has been going for 22 years. And I tend to write to get people to think. Whether you agree with me or not, I'm not worried; I don't want you sitting on the fence saying, well, possibly. You're either going to agree with me or totally disagree. And so it's written in that sort of provocative style. It's also written, I would hope, in a fairly blunt... well, it's very sarcastic.

Dori Gonzalez-Acevedo:

It's extremely factual: the details you lay out, the timeline of guidances that have previously been written, what they state, all of those things. What I was very impressed with in the article in general was how factual it was, and how balanced it was on both sides of the argument it then makes.

Bob McDowall:

I think, because we are a regulated industry... let me be totally honest here. Having written books and educated people on data integrity, I've got to be very honest: for 15 years I worked in the industry, from 1978 to 1993, under GLP and under GMP, and I never once read a single regulation. It was always interpreted for me. But being a consultant, you suddenly realize, hang on a sec, I've worked in two companies where there were two totally different interpretations of the same regulation. Therefore, I have to go back to basics. And what we find even now is that people do not read the regulation. Therefore, if I write something based on my opinion, you've got to be able to derive that opinion from the regulation or the guidance, and then say, this is my opinion. You can't just come out and say, this is my opinion. And that's where I'm concerned, where there's no guidance from the FDA on CSA. Now, to come back to your original question about how I got the title: well, let's go back to the 17th century, where the father of chemistry, Robert Boyle, wrote The Sceptical Chymist. Well, I'm probably the 21st century's cynical chemist. So I wanted something that would attract people's attention and get people to say, hang on, what's this idiot talking about?, and to go down into more detail. I gather it's created some interesting feedback; most of what I've received directly has been very positive. I'm fairly certain there are a few images of me floating around various organizations with large needles in them.

Dori Gonzalez-Acevedo:

I wouldn't necessarily say that, but, you know...

Bob McDowall:

I've got a fairly good impression by now.

Dori Gonzalez-Acevedo:

Being a consultant as long as I have as well, right, every single client that I deal with has a very different view or lens through which they interpret. So to your point earlier, now you see it as a consultant rather than being in a company. And of course there are those who jump from company to company and take their philosophies and ways of being with them, but for the most part, consultants get to see a very, very wide spectrum of how to interpret, and of what companies think about the regulations. And one of the things I'm often asked is: given that I see all of that, what do I know that works better or not? How does one apply these principles in the way that is most cost-effective, most, you know, risk-balanced, all of these sorts of things? And to get to some of the heart of the things that you talked about, like "least burdensome approach": that has been a thing forever, and now suddenly it's popped up on the CSA radar, but that has always been the intent of the regulation from day one. So why?

Bob McDowall:

You go back to 820: it's in 820, and in the preamble to 820, as well as in the General Principles of Software Validation, which is 20 years old. And I guess, where I'm coming from, as I say, the article was trying to put in what I hoped were fairly clear terms why we don't need it, because we already have the regulation and the guidance. And that's the important thing: you've got it. This is, I think, part of the problem as we come through. If we go back to 1993, when I first started consulting, I went to a company that shall remain nameless to protect the guilty, and I was presented with two filing cabinets, six foot high and two meters wide, full of screenshots. Now, I'm thinking, why? Because there's some requirement for witnessing and everything else. If we go back into the 80s, this is how it started. Yet if you look in the regulations, the only requirement in GMP is for copying the master batch record, on the basis that you don't want to make five tons of drain cleaner, you want a fine pharmaceutical product. So it's risk management where it's needed. And I think there's also a reluctance to take risk, especially with QA, and also with some management. Now, let me give you an example. Where I was working under 820 and ISO 13485, we were implementing a learning management system that was ultimately going to have a quarter of a million users on it. And I went through the regulations, and I said, here's the rationale: you don't need to electronically sign your training records, you just need attribution of action. Right? So we do the nine-week validation, and we come up to the last week. And on the Monday of the last week, the guy I'm working with gets an email from the Vice President: what's this, you're not putting in electronic signatures? We sign our training records now; we're going to do it in the new system. Now, this is stupidity on stilts: management not understanding the regulations, and being stuck in what I would technically call the Middle Ages. And so what happens is that we now have four days, or three and a half days, to understand how electronic records and electronic signatures work; write the test script to demonstrate that it works; update the URS, the traceability matrix, the risk assessments, and all the other things; implement it, test it, and write the validation plan. We just about made it.

Dori Gonzalez-Acevedo:

Yeah, I still see this today. I also see auditors insisting that software vendors have training systems, quote unquote, "validated" with Part 11 records. It's really absurd, exactly to your point. But how do we change this, if that's what's out there and has been for decades now? I think part of this CSA movement was a hope to spark some sort of conversation about this practice, right?

Bob McDowall:

I think, actually, this is where ISPE and the GAMP Forum come in, because they have published a guidance on enabling innovation, and one section in there is on agile development. And they go to town in that section to say: look, you don't need electronic signatures, you need attribution of action, all this stuff. And if you've got development using Jira or DevOps or an equivalent sort of application, it's so easy to audit; it's a doddle. And if you put the gates in where you can't test until you've done a peer review and closed out all the comments, it's beautiful.
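
(A minimal sketch of the kind of gate Bob describes, in Python. The record and field names are hypothetical illustrations, not Jira's or Azure DevOps' actual APIs; real tools express the same idea through workflow rules.)

```python
from dataclasses import dataclass, field

@dataclass
class ReviewComment:
    text: str
    resolved: bool = False

@dataclass
class WorkItem:
    """Hypothetical development record, e.g. a user story plus its code change."""
    identifier: str
    peer_review_approved: bool = False
    comments: list = field(default_factory=list)

def can_enter_testing(item: WorkItem) -> bool:
    """Gate: testing is blocked until peer review is done and all comments are closed."""
    return item.peer_review_approved and all(c.resolved for c in item.comments)

story = WorkItem("LMS-42", peer_review_approved=True,
                 comments=[ReviewComment("Rename variable", resolved=True)])
assert can_enter_testing(story)  # the gate opens only when the record is complete
```

Because the tool enforces the gate and keeps the history, the attribution of action Bob mentions falls out of the records themselves.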

Dori Gonzalez-Acevedo:

Yeah. I laugh, because I drive a Tesla, right, and I get software updates over Wi-Fi. I'm fairly certain they don't use Part 11 signatures in software development, and yet I entrust my Tesla to drive me and my children to where they need to be. So I feel like this risk conversation circles back around as well. It's like not being willing to have a nuanced, hard conversation about the risks associated with certain systems. Certain systems are extremely risky, right? There are the ones that directly impact product quality, safety, patient safety. But then there are others that are not...

Bob McDowall:

...at all. Yes, I can only agree. But even with those that directly impact product quality, because of the nature of the software, you can take a simpler view. And this was something where we developed an approach, going back 13 or 14 years now, where we condensed the whole of the validation down into a single document.

Dori Gonzalez-Acevedo:

Okay, I've done the same thing with my customers. Yeah, because it makes sense, right? Why have five, six, seven different documents when you can summarize them in one short thing?

Bob McDowall:

Yeah. And if you go back and look at the regulation, it says "intended use". So you take intended use, from 211.63, and you define just the intended-use requirements.

Dori Gonzalez-Acevedo:

I think folks struggle with that, though. Understanding what intended use is, and what that means in terms of a predicate rule, is still very, very hard to define for some organizations.

Bob McDowall:

The easiest way to get around that is to actually draw the process out

Dori Gonzalez-Acevedo:

A process map and workflow, yeah.

Bob McDowall:

Well, actually, the best piece of consulting I did (I met the guy in April that I worked with at the company) was a two-day process mapping and then, a month later, another two-day redesign. And the redesigned process, using electronic signatures, has been running now for 18 years unchanged. It has gone through several upgrades of the software. It was just for one site; it is now six business units globally, the same process. And I'm thinking, wow, best piece of work I've done. We published it in 2005. If we did it again, the mapping would still be the same, but the level of testing would be much reduced now, because we'd be leveraging from a trusted software supplier. Right?

Dori Gonzalez-Acevedo:

Because back then we weren't. Right. So I want to hold on to that word, trust, because you raised it a couple of times in your article. There needs to be a level of trust with your software vendors; there needs to be a level of trust with your testers, as you highlight as well. And I wonder if you can talk a little bit more about that, because I wonder if that's part of the QA conundrum, in terms of when to release: some of their angst is around this trust issue.

Bob McDowall:

Okay. Well, let's look at it from this angle: you're buying a system and you've got to assess the supplier. So what do people normally do? They send out a supplier questionnaire, and the supplier fills it out. You could just file it; I would always verify one or two key elements, especially if it was GAMP Category 3 software. But that assessment is only part of the job, in my view. As you get into a Category 4 system, I would do a one-day assessment of software development. You can do this remotely or you can do it face to face. If anyone who's listening is a software supplier in Hawaii, I can do a very good deal for you. I suspect Dori, living a bit closer, will do a better deal, but that's beside the point. The issue is: how do they develop software? Do they use a waterfall model? Do they use requirements specs, all this sort of thing? And I am also a trained auditor, and what you find with some companies is that I spend my life, when I'm looking at software development, trying to tell people to do less rather than more. Because what you find is that they have software development based on Jira or DevOps or an equivalent piece of software, and then the pharmaceutical auditor comes in: where's my requirements spec? where's this? where's that? And these guys are working away in the engine room, and that's perfectly adequate; and then they put in all these bells and whistles, we want this, we want that, which are totally unnecessary. And that's where I think auditors, or some auditors I should say, because obviously we have to exclude us two from that, bring in a degree of separation. You need to be able to get past labels and say: I don't care what you call it; the question is, do you do it? And the key thing an auditor has to ask, whatever they're auditing, is: are these guys in control? Can they demonstrate that they're in control? Is there traceability from where they start, be it a requirements spec, a marketing spec, a user story, an epic, whatever you want to call it, through the whole of the lifecycle? And have they done sufficient? If they've done that, you don't have to take a lot of time; you can do it remotely or on site. But this is where you need to start to differentiate: you may have a Category 4 application, but many of the functions within it are actually Category 3; you only parameterize them.

Dori Gonzalez-Acevedo:

Yeah, that's a great distinction, and it struck me when I was rereading your article and how you talk about it. The majority of tools purchased on the market today are made for general consumption, right? They're made for a host of industries across the board, Jira, you've mentioned, all of these things. They're not made for life sciences specifically, so let's be real. But what they do do well is that they're highly configurable, and they're configurable in ways in which a life science company can use them. And back to your process mapping sort of conversation: the customer, the company, needs to understand what they're going to use the tool for, and then figure out what is important about that. It's not on the software vendor to say, oh, by the way, if you only want to use this for GxP systems, then this is what you should do. These are big multinational companies here. It's up to the company; what you want to use it for really matters, right? And if you want to go through the extra due diligence of writing formal URSs and FRSs and all this stuff for functions that already exist, that have been tested a gazillion times by people who know their stuff inside and out... if you think you can do better than the people who actually developed it and were paid to do that testing, I don't know, that's kind of arrogant.

Bob McDowall:

We still need a user requirements spec, or some form of spec, at the front end, which defines the intended use. Now, in terms of managing risk and intended use, let me go to a lab example of a chromatography data system. You look in the CDS, and you have somewhere between 10 and 15 different calibration models you could use. Which ones do you actually use? You document those in your URS. And then, as part of your assumptions, exclusions, and limitations, you document: I'm not using these, and they're excluded from the validation. I use assumptions, exclusions, and limitations a lot, because when I first started computer validation there was no GAMP Guide, and the only thing we could find were the Institute of Electrical and Electronics Engineers (IEEE) software engineering standards. Now, if you're building a nuclear power plant, or you're putting in, you know, a microprocessor chip factory, you're going to be at the top end of that; but we want something that is fit for purpose. So you take these standards and documentation, and you adapt them for what you want. In the standard IEEE 829, which is software test documentation, there's a test plan, and I still use that to this day. In section 6.3 of that test plan is a section called assumptions, exclusions, and limitations, better known in the trade as alibis, excuses, and lies. But I won't say that publicly. Now, what these assumptions, exclusions, and limitations let you do: you cannot test everything, and you're sitting your approach on top of what the supplier has done. So if you've done your assessment of the software development process, all the functions that are essentially GAMP Category 3 you're not going to test; you may exercise them indirectly when you validate your workflow. But that's the way you're going to construct things. It's also a matter of what you have the supplier show you under test; you won't be able to see everything, because you've only got a day, and it's an audit. But you have those functions, and you have your assumptions, exclusions, and limitations. Do you test all the different combinations of, say, user roles and access privileges? That will keep you tied up for a few weeks. These are the sorts of things you can try to reduce, keeping a level head about what you can realistically do. And the other thing, and I've been caught up in this, is don't customize: change the process to match the system, rather than change the system to match your crappy process.
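
(A back-of-the-envelope sketch of why exhaustive role-by-privilege testing explodes, using Python's itertools. The role and privilege names are hypothetical; the point is to test a risk-based subset and record the rest under assumptions, exclusions, and limitations.)

```python
from itertools import product

# Hypothetical example values; a real system may have far more of each.
roles = ["analyst", "reviewer", "lab_manager", "sys_admin"]
privileges = ["create_sample", "modify_method", "approve_result",
              "change_config", "delete_record"]

all_combinations = list(product(roles, privileges))
print(len(all_combinations))  # 4 roles x 5 privileges = 20, before any test-data variants

# Risk-based subset: verify only the denials that protect data integrity,
# and document the untested combinations as exclusions.
must_deny = [(r, p) for r, p in all_combinations
             if r != "sys_admin" and p in ("change_config", "delete_record")]
excluded = [c for c in all_combinations if c not in must_deny]
print(f"tested: {len(must_deny)}, documented as excluded: {len(excluded)}")
```

Scale the role and privilege lists up to a real system and the full matrix quickly reaches the "few weeks" Bob describes, which is exactly why the exclusions section earns its keep.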

Dori Gonzalez-Acevedo:

Say a little bit more about that, because that's one thing that, over the last several years, I've tried very hard to persuade people away from. But there are always those stragglers who say, oh, but we have to have it our way, right?

Bob McDowall:

All right. There's a Scottish comedian called Billy Connolly, and he has a class of people called "stupid but saveable". So you have to persuade those people not to do custom software development, because I've done it, and I know what the pain is. And if you do it once, you don't do it again. So let me go back to last century, when I had what my wife would call a proper job, and we were implementing the first LIMS I was involved with at SmithKline. We had to take what was essentially a sample-driven LIMS and make it into a protocol-driven LIMS, and we were quoted 14 weeks of custom software. It took 151 weeks. We then had to validate it. And then, of course, you find the problem: I know what it says in the specification, but what we really want is this. And because it was a fixed-price contract and the supplier had lost a lot of money, you're going to pay through the nose for it, and it's not going to be peanuts; it's going to be every single time you want to update the system, because it's mostly hard-coded. So my second system was: yes, we'll have a protocol-driven system, it will be part of the standard product, and I will not accept custom code. So my view is always: look at your process, and wherever at all possible, change the process to match the standard system. It will be a lot easier; you won't have custom code to upgrade and link with any changes, and you can upgrade a lot quicker and a lot easier, rather than leaving it until you get the "dear esteemed customer" letter that says you've got nine months left before your system's dead. And that's really the way I would like to go: change the way that you work. I've got examples where you look at a process and you've got two or three ways to run through it, and that's not counting the undocumented processes that people sneak out in confidence when you're doing the mapping. If you automate all of that, it's going to take a lot of time to automate, and even more time to validate. And that's the big problem.

Dori Gonzalez-Acevedo:

Yeah. Now, again, this excludes the companies doing software as a medical device, or folks actually making software products for life sciences, which is a whole different category. But we're talking about those applications that are readily available off the shelf, the highly configurable ones that most companies already have. Because if we look at the life science industry, I would argue a lot of the companies have 50 to 60% overlap in the applications they purchase: this one is the same as this one is the same as this one. There might be a couple of different flavors in there, but for what we're talking about, it's usually pretty much the standard same. So no customization is really, truly required in order to get your work done. I also see that a lot in the SAP world, right, where...

Bob McDowall:

Yeah, well, I'll leave you to that suffering and pain; I'll keep out of that. But the one thing, if I can come back briefly to configuration versus customization: you have to be very, very careful, because the marketing departments of certain organizations have discovered they can't call it customization; even if they give you a language, it is always "configuration". But if you go to the GAMP Guide and look in Appendix M4, it says if you've got a vendor-supplied language, treat it as Category 5. So some LIMS are going to be in that sort of situation. As long as you control it, it's fine.

Dori Gonzalez-Acevedo:

And tested, right? Because you're going to have to test that configuration anyway. Yeah, exactly. Great points. That, though, also implies a very natural critical thinking skill set that runs through this entire thing we're talking about, right?

Bob McDowall:

Oh yeah. Now, talking about critical thinking: I make the point in the article that I was auditing a clinical system a few years back, and I was looking at the test script, and it started in November and finished in February. And I'm thinking, why has it taken so long? We were waiting for the password to expire. Now, you've got, say, a 90-day standard password, which tends to be the industry standard these days, unless you've got a few people who want to be a little more lazy and push it up to 120 or even 190. But say 90 days: the normal way around that is that in the test script you would go in, change the configuration to one day, and wait for it to expire. But these days that's not seen as a good idea, because you are now messing around with the configuration, and in today's data integrity environment that's a total no-no. But look and see: how does the computer actually determine time? You have a trusted time source that sets the clock. And how does the computer work? It's got a little piezoelectric crystal that vibrates, and the computer converts the number of vibrations into time. And guess what? It's on every single computer. So if you want to wait 90 days to test the expiry of the password, that's great. On the other hand, if we come back to the assumptions, exclusions, and limitations: why are you testing it? If you've got a trusted time source, and you've got a piezoelectric crystal on every single system, what's the point? So it is easier to exclude, but document it; that's the important thing. All of this is important to put down: we're not doing certain things, or we are doing certain things but there are limitations. It's that thought process that is really critical, and this is where these assumptions, exclusions, and limitations come in. Because if you go back to 1970, when the US military were asking Barry Boehm to predict what software was going to be like in 1980, he said, I've got some good news and some bad news. They said, what's the good news? There is none. The bad news is the software situation is going to get a lot worse. And in that report he gave them a simple diagram of some software, and if you could test one pathway through it per nanosecond, and you started when Jesus was born, by the time the report was published you might be halfway through testing it. Now, management is rather unwilling to allow that amount of time to test software. Oh, and by the way, he said, this is a simple program flow segment; there are something like 10^21 different pathways through it. You can't test it. So what are you going to do? Focus on what your intended use is.
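
(The arithmetic behind that anecdote is easy to check. A sketch, taking the 10^21 pathways quoted here at face value, just to show the scale.)

```python
paths = 10**21                 # pathways through the flow segment, as quoted
rate_per_second = 10**9        # one path per nanosecond
seconds = paths / rate_per_second
years = seconds / (365.25 * 24 * 3600)
print(f"{years:,.0f} years")   # roughly 31,700 years of non-stop testing
```

Tens of thousands of years for a "simple" flow segment: exhaustive path testing is off the table, which is the whole argument for scoping to intended use.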

Dori Gonzalez-Acevedo:

But to your point: this kind of critical thinking, looking at each system, doing that analysis, writing up your exclusions, writing up your limitations, all of those things, one, takes time, and two, takes perhaps different perspectives to get it all documented. Whereas, as in your article, and I've also seen this over the years, there's this kind of bucketizing approach: a we-want-a-tailored-checklist, do-it-the-same-way-every-time sort of mentality, rather than having the conversation. It's almost as if people don't know how to have that conversation, or maybe don't know how to facilitate it. What are your thoughts?

Bob McDowall:

I was at the first face-to-face meeting I'd been to in two years, in Italy at the end of April, and somebody in the audience said, we're implementing some automated training records, and the QA department want to have everything signed electronically. And I said, you want to fire the QA department, because they don't understand the regulations. Tell them to go and read the regs. So the first thing: read the regs, read the guidance, understand them, and then work out from there what you really need to do. That makes it so much easier, and so much more effective. What I would say is that documenting these assumptions, exclusions, and limitations is actually relatively straightforward, because as you start to design, some things are going to be fairly obvious. You're going to exclude pulling the plug out of the back of the computer... unless you're testing a UPS.

Dori Gonzalez-Acevedo:

Yes, I agree.

Bob McDowall:

But there are other things as you go down, where you start to write and say, well, actually, I could do two or three things here, but what really is my intended use? And you document it at the time you write your test script. That's where I think it is very important that you keep aware of what you're trying to do, and keep in mind the requirements that you've written. I think you're slipping out of the window there. Oh, sorry, am I putting you to sleep?

Dori Gonzalez-Acevedo:

No, it's just my half-sitting, half-standing at my desk throughout the day. Yeah, I think, you know, having clear requirements is also an art form, right? Writing good requirements, writing testable requirements, writing requirements that are technically understood: again, if we're looking at general software use, some of those requirements are technically easy to do from a software vendor perspective but not really understood from an end user perspective, and you have to differentiate the two clearly. The other thing I noticed when I was reading through your article: I still see folks wanting to do the full FMEA sort of assessment on standard software, and that kind of worries me, because it just doesn't make any sense, rather than really taking a critical look at their user requirements from a risk perspective and really understanding why.

Bob McDowall:

It's interesting, and I don't wish to criticize the GAMP Forum, but I will make the comment that they even put FMEA into the first edition of the Good Practice Guide for IT Infrastructure Control and Compliance. Now, I never use that, because of what doing a failure mode effects analysis means in an IT environment where you've got a lot of standard stuff. On the other hand, there are two very good IT-based risk assessments, the first being an old British standard, BS 7799 Part 3. The first two parts have migrated into ISO 17799 and then into ISO 27001 and 27002, but 7799 Part 3 has a risk assessment in it. There's also a NIST Special Publication, 800-31 or 800-41, that is IT-related. And they focus on: what's the value of the asset that you are managing? So if you have a batch of product worth $5 million per batch, you want to make certain that your cybersecurity, your backups, and all the other stuff are right. And that's only one batch; how many batches of data are you keeping for registration, when you've got a billion-dollar-a-year product? So it defines the assets, and then it starts to look at the vulnerabilities. What have you got in place? What are the vulnerabilities? What do you need to do? And you can conduct one of those risk assessments in half a day with a small team of people, and you don't have to worry about an FMEA where the Martians are landing next week.
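
(A minimal sketch of asset-based risk scoring in the spirit Bob describes, not the actual BS 7799-3 or NIST worksheets. The asset names, dollar values, and the 1-to-5 vulnerability scale are all hypothetical.)

```python
from dataclasses import dataclass

@dataclass
class Asset:
    name: str
    value_usd: float       # what a loss would cost the business
    vulnerability: int     # 1 (well controlled) .. 5 (exposed)
    control_in_place: bool

def risk_score(a: Asset) -> float:
    """Crude exposure measure: asset value weighted by vulnerability,
    halved when a mitigating control already exists."""
    score = a.value_usd * a.vulnerability
    return score / 2 if a.control_in_place else score

assets = [
    Asset("batch record data", 5_000_000, 2, control_in_place=True),
    Asset("registration stability data", 1_000_000_000, 3, control_in_place=False),
]
for a in sorted(assets, key=risk_score, reverse=True):
    print(f"{a.name}: {risk_score(a):,.0f}")  # remediate the top of this list first
```

The half-day workshop Bob mentions is essentially filling in a table like this with the right people in the room, then acting on the ranking.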

Dori Gonzalez-Acevedo:

And this makes a great point, because one of the things that I try to help my clients with is really getting to the data itself. And I think it's something you cover when you lead your workshops on data integrity: what we're reviewing and approving matters, right? It's not just the process that we go through, but what it is. So while a learning management system need not be, quote unquote, "fully validated" in this very old-school, traditional way, what's probably more important in some ways is the content of the training being delivered, and who is reviewing and approving the training itself, rather than just posting trainings and all that sort of thing. Getting back to the focus on the actual data is really where the heart of the whole conversation needs to go.

Bob McDowall:

I think that's a good point. Because if you look at the way computer validation started, it's top-down. I always take the view that there's no point in buying a computer system and implementing it unless you get significant business benefit out of it. Otherwise, give the money to my daughters and my grandson, and they will waste it far more effectively than you ever will. But the point is: you get business benefit, and compliance comes along as a secondary objective. So look at the business benefit, look at the process and the business benefits from it, and then start to identify the data and how you manage it. What the GAMP Forum did, with a now out-of-date good practice guide from 2005 on Part 11 compliant electronic records and signatures, was identify the records and then take a bottom-up approach. But the problem with a bottom-up approach is that you never get a look at the process and the process efficiencies, so it never really took off. What I would advocate is both a top-down and a bottom-up approach: top-down from the process, look at configuring the software, then look at the data, see what the vulnerabilities are, control those, and that's what you end up validating. And again, that streamlines the business benefits. Because if you spend six months, a year, and I know companies that spend even longer than that, validating a system, you don't get an efficient business process. You've wasted all your money.

Dori Gonzalez-Acevedo:

That's a great point. So to that point: we talk about agile project management and agile deployment as a methodology, right? One of the things I've been advocating for a lot now is this MVP sort of way. We can validate and control things along the way; we don't have to wait for the six-, seven-, eight-month process that many of these organizations want in order to see the business value of what we're going to do. You can't assess every possible scenario, but what you can assess is the quick hits, the things that will improve your process immediately. Quick wins: get those in place, put a process together, test them against intended use, and then start using the system right away, rather than waiting while some of these approval cycles run two or three weeks just to get a requirements spec or a summary report approved. And is that really adding business value to the critical quality attributes the companies actually want to document, maintain, and do metrics on? I feel like it's counterproductive. And how do the business process owners really justify waiting, or letting project teams wait, in a waterfall approach in that regard?

Bob McDowall:

Yeah, I think the number of signatures on a document needs to be very much curtailed: you need technical review, technical approval, compliance review, and release, a maximum of four. And in fact, when I come down to test scripts, I try to negotiate up front with QA to have just two signatures on the technical content: technical review and release. The post-execution review includes a QA review but, differently from many people, I have an overarching test plan which is approved by QA before anything goes, so QA have good oversight. But there's not a lot of value in a QA signature on a test script, especially when you want to move quickly. And if I come back to the issue I had with my learning management system a few years ago, where we had to implement things very, very quickly: having just two signatures enabled us to write the test script, execute it, and then have it reviewed quickly, within that short period of time.

Dori Gonzalez-Acevedo:

Do you think it's a fear of failure that holds our industry back?

Bob McDowall:

No, I think it's basically CYA. If it was okay last inspection, it's okay this one; and they don't think about the "c" in cGMP. That's the biggest problem. And changing tack slightly: if you look away from the CSA side and look at some of the warning letters coming through, FDA are getting really cheesed off with industry. Look at the Stason and Tender warning letters from July 2020; look at the BBC warning letter from August of last year. With BBC, you've got instruments with the ability to store electronic records, but the company didn't use it; they just printed everything out. And they were cited for that both in raw materials and in finished product testing. Stason and Tender were absolutely screwed to the back wall by the FDA, with two different parts of the CFR cited and identical, word-for-word remediation. And it's pages of it.

Dori Gonzalez-Acevedo:

Yeah, I find it very disheartening when I hear any client wanting to take a PDF of an electronic record and load it into another system. The number of additional non-compliance areas they create by not using the systems as they are designed to be used makes for more compliance risk in general, right? So where do we go from here, Bob? How do we continue to help an industry that clearly, I think, wants some help? Because I think part of the CSA movement in general is that people want something; they want to be told something; it's like a prescription. And ProcellaRX, my company, is a prescription for software quality. I feel like that's what they really want: a very clear roadmap of what they should do. But at the same time, that's not what the regulatory bodies want at all, right?

Bob McDowall:

I think the key thing here is that you need a flexible approach to computer validation; you cannot have one size fits all. Look at the lifecycle models in GAMP 5, soon to become GAMP 5 Second Edition. If you look at those models, Category 3 is basically specify, build, test, and that's where you can condense virtually all of it into a single document, with traceability, with assumptions, exclusions, and limitations, and a pro forma report, and so on. You need to be able to have that. I also think people should use prototyping a lot more. Because it goes: I've got an idea, I've got a generic spec to select this system; but that's generic, and this is system X version Y. I've got to be able to understand how it works. So I need training; I need management to say, okay, you're going to get training, and you now need to play with the system. And in my view, for a large system, you really can't do it part-time.

Dori Gonzalez-Acevedo:

No, you can't, that's right. And if you don't, it takes a long time to really understand that system in its entirety, and then to get the enablement and adoption across the large organizations we're talking about, right? We're talking about very, very large organizations. So you need that time to play, without the constraints of all the heaviness of the validation.

Bob McDowall:

That's where I would see it, if we come back to CSA, and documented: you play around with the system, you look at it, you see what it's like. And from there you write a second version of your URS, one that reflects what the system is and how you're going to use it. And you play around with the configuration settings so that you can get a far better understanding of how things work. Once you've got that, then you can try to reduce the amount of testing by assessing the supplier. Of course, this is a two-edged sword: if you find they're working on sealing wax and string, or I should say backs of envelopes and undocumented testing, then you've got a major problem. But most companies don't do that. So even if they don't have a formally assessed or certified QMS, if there is a QMS, then I will be quite happy, providing it meets certain requirements. Then you can start to reduce the amount of testing by saying: okay, if it's a Category 4 system, how many of my requirements are actually Category 3 functions? And from there, focus on what is configured. And don't forget, 211.63 talks about adequate size. If you look at GLP, it's even better, because 58.61 talks about adequate capacity. So where are your pinch points? You need to make certain you've got enough capacity, or size, to handle those pinch points.
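
(A minimal sketch of that scoping step, with hypothetical URS records. The idea is to count how much of a Category 4 system's URS is really standard Category 3 functionality that the supplier assessment already covers, so in-house testing can focus on what was configured.)

```python
# Hypothetical URS entries for a Category 4 (configured) system.
# category: 3 = standard, unconfigured function; 4 = configured by us.
requirements = [
    {"id": "URS-001", "text": "Acquire chromatogram from instrument", "category": 3},
    {"id": "URS-002", "text": "Linear calibration model only",        "category": 4},
    {"id": "URS-003", "text": "Vendor-standard audit trail enabled",  "category": 3},
    {"id": "URS-004", "text": "Site-specific result review workflow", "category": 4},
]

to_test = [r for r in requirements if r["category"] == 4]    # focus in-house testing here
leveraged = [r for r in requirements if r["category"] == 3]  # covered by supplier assessment

print("test in-house:", [r["id"] for r in to_test])
print("leverage supplier:", [r["id"] for r in leveraged])
```

On many real systems the second list dwarfs the first, which is exactly the reduction in testing Bob is describing.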

Dori Gonzalez-Acevedo:

Yeah, that's a great point, because in software quality testing, performance testing is actually really important, and often not thought about when we talk about CSV in general, because it's not one of the boxes you check from a traditional CSV model perspective. But when we're talking about software in general: performance, availability, these sorts of things matter, especially as we shift towards a SaaS model for many of the systems we're discussing. So security, right? Security and performance for SaaS are much, much more important than whether a particular feature is in its full function. You might be in beta from a feature perspective, but if the security and the performance aren't up there for the size and volume of what you're going to do, it doesn't matter. All right, so, Bob: last words, last thoughts?

Bob McDowall:

I think the last thought would be that I would encourage people, or companies, to go back to basics. Look at the regs. And if you read between the lines of the Part 11 Scope and Application guidance from 2003, one of the things it did say was: look at the regs. Not in so many words, but it says "where the predicate rule requires", and that says go back and read. So, back to basics: look and see what it says, and look at some of the guidance documents. In the absence of anything else, I would advocate the General Principles of Software Validation; some of the development side of it is probably out of date, it's 20 years old, but there are publications on the use of agile. I think there's a joint one; Taylor from FDA's medical devices side was involved in one of these AAMI publications.

Dori Gonzalez-Acevedo:

AAMI, right. I have the link here; we'll add it to the show notes.

Bob McDowall:

Yeah. I think that one looks at agile for developing software in a regulated area. And it's really about looking at what you want out of suppliers and out of systems. Part of that would be: retrain your auditors, because of some of the things they're asking for. I work not just for the industry; I also work for software suppliers selling into the industry. If an auditor asks for something, just say: okay, I can't understand why you're asking for this; show me where it says that in the regulation. Is it the auditor's opinion, or is it actually a regulatory requirement, or is it in a regulatory guidance document? Show me. And the one thing is that the software suppliers are more interested in getting a sale, and therefore they start to roll over. I think they need to push back a bit more, politely, not aggressively, but really do things. And coming back to companies: really assess what you do. Do you need screenshots for everything? No. You can use the system and the audit trail to self-document, and validate the audit trail. It makes so much more sense. Keep it simple.

Dori Gonzalez-Acevedo:

Keep it simple. I love it. Well, Bob, thanks so much for sharing your time and thoughts with us. I really appreciate it.

Bob McDowall:

Thank you for inviting me. I have enjoyed our conversation. I hope I haven't monopolized

Dori Gonzalez-Acevedo:

it. Not at all, not at all. I always welcome the conversation, and I appreciate you taking the time from across the pond; it's late at night for you, and you stayed up for us. I appreciate that as well.

Bob McDowall:

Okay, all right. Thanks very much.

Dori Gonzalez-Acevedo:

We'll talk to you soon. Take care.

Bob McDowall:

Okay, that's good.

Dori Gonzalez-Acevedo:

Thanks for listening to Software Quality Today. If you liked what you just heard, we hope you'll pass along our web address, procellarx.co, to your friends and colleagues, and please leave us a positive review on iTunes. Be sure to check out our previous podcasts, and check us out on LinkedIn at ProcellaRX. Join us next time for another edition of Software Quality Today.