Inside Out Quality

The 1983 Air Canada Incident and Design Controls

February 23, 2021 Aaron & Diane Season 1 Episode 7

Diane and I explore an Air Canada incident where a flight ran out of fuel at 41,000 feet. Retired United Airlines Captain Sam Biondo joins us to talk us through aviation safety and this incident. Diane and I are also joined by Joe Ostendorf, a medical device regulatory consultant, to discuss design controls and how they can prevent your product development from running out of fuel.

If you are interested in reaching out to Joe, he can be reached at [email protected]  

 

Aaron Harmon:

Hi, I'm Aaron Harmon.

Diane Cox:

And I'm Diane Cox. Welcome to Inside Out Quality.

Aaron Harmon:

Both Diane and I build and implement quality systems in the biotech and medical device industries. But we often get asked: Is this really necessary? Are we doing too much too early? Do we even need a quality system?

Diane Cox:

Our goal is to explore questions like these through real life events and experiences shared by our guests from various regulated industries. We will show you why quality is not just about compliance and how when it's done right, it can help your product and company improve lives and make a difference.

Aaron Harmon:

On July 23, 1983, a new Boeing 767 flew across the skies of Canada, piloted by Captain Bob Pearson and First Officer Maurice Quintal, along with six crew and 61 passengers. The day before, the plane sat in Edmonton. During a routine check, all three fuel gauges were found to be blank: one for an auxiliary tank, the other two measuring the fuel in each wing. The technician found that by deactivating a circuit breaker, the other breaker took over and the gauges came back online. The technician tagged the bad breaker and marked it inoperative. He then put yellow tape above the gauges saying "see the logbook." This was logged as a snag. The gauges all worked, and there was another procedure that could be used to confirm fuel levels, so the pilot flew the plane safely to Montreal. In Montreal, a new captain took over for the next flight: Captain Pearson. During a brief chat in the parking lot, the previous pilot explained that there were issues with the fuel gauges; however, there should be enough fuel to make it to Edmonton. A new technician reviewed the logbook and performed the maintenance check of the airplane. He saw the entry in the logbook, decided to get to the bottom of the problem, and ran an on-board diagnostic test. For this to work, he had to reactivate the faulty breaker. Again the gauges went blank in the cockpit. He tried to replace the breaker, but there was none in Montreal, so he ordered one from Edmonton. He then got distracted and failed to deactivate the faulty breaker like the previous tech had done, and now the gauges stayed blank. Captain Pearson got ready for his flight and noticed the taped breaker and blank gauges. He checked the rules for the minimum equipment list and saw that at least two gauges must be working. There is some confusion about what happened next, but Captain Pearson thought he was allowed to fly the plane as long as they confirmed the fuel level using drip sticks. However, fuel weight measurements used to be in pounds, and on this aircraft they were in kilograms.
The drip sticks require a conversion from centimeters to liters, and then from liters to kilograms. The first part was easy, using tables in the plane to convert centimeters to liters. The conversion of liters to kilograms uses specific gravity, and this is where the last error occurred. Historically, pounds were used, with a conversion factor of 1.77 pounds per liter. Captain Pearson and his team used that number instead of roughly 0.8 kilograms per liter to convert liters to kilograms. They had overestimated their fuel by a factor of 2.2, meaning they had less than half the fuel they thought they had. At 41,000 feet in the air, they ran out of fuel over Red Lake, Ontario. Only a magnetic compass, artificial horizon, airspeed indicator, and altimeter remained to guide the powerless plane to an airstrip at Gimli. A number of things went wrong, but one thing went right: Captain Pearson was a glider pilot. He and his first officer glided the plane to Gimli, and no loss of life occurred. This is an incredible story of small errors having a big impact. To help us understand, here is Captain Sam Biondo. Welcome to the show, Sam.
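The unit mix-up can be worked through numerically. The sketch below uses figures drawn from public accounts of the incident (approximate, for illustration); the logic of the error is the point.

```python
# Illustrative reconstruction of the Gimli Glider fuel miscalculation.
# Figures are approximate; the unit error, not the exact numbers, is the point.

LB_PER_LITER = 1.77   # old specific-gravity factor, in POUNDS per liter
KG_PER_LITER = 0.803  # correct factor, in KILOGRAMS per liter

fuel_required_kg = 22_300   # approx. fuel needed for the Montreal-Edmonton leg
liters_in_tanks = 7_682     # approx. dripstick reading before refueling

# The crew multiplied liters by 1.77, believing the result was kilograms.
believed_kg = liters_in_tanks * LB_PER_LITER  # actually pounds!
actual_kg = liters_in_tanks * KG_PER_LITER

# They then uploaded only enough fuel to close the *apparent* gap.
liters_added = (fuel_required_kg - believed_kg) / LB_PER_LITER
total_liters = liters_in_tanks + liters_added
total_actual_kg = total_liters * KG_PER_LITER

print(f"Believed on board after fueling: {fuel_required_kg:,} 'kg' (really pounds)")
print(f"Actually on board after fueling: {total_actual_kg:,.0f} kg")
print(f"Overestimation factor: {LB_PER_LITER / KG_PER_LITER:.2f}")  # ~2.2
```

Running the numbers this way shows roughly 10,100 kg actually on board against more than 22,000 kg required: less than half, exactly as the factor of 2.2 predicts.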

Sam Biondo:

It's good to be here.

Aaron Harmon:

So the first question I wanted to ask is just how did you get into aviation?

Sam Biondo:

Well, it was just like falling off a log for me. My dad was in the Air Force; he fought in three wars. When I was very young, the Air Force was very young, and so we lived on Air Force bases. When I was like five years old, I can remember me and a lot of my five-, six-, seven-year-old friends would crawl up through the forest behind our house and up to the flightline. It was chained off with a chain link fence, and we would just hook our fingers into that chain link fence and watch the airplanes go by. And then later on in my life, my dad would bring me models, and not the kind of models that you put together, although I did plenty of those; these were the real models that the Air Force used in their squadron buildings, and he would hang them on my ceiling. I don't think I ever thought any other thing would ever be my job. I thought that's where I was going to go. And then of course I lived my teen years during the Mercury astronauts, you know, in the '60s. So then immediately I was enthralled by becoming an astronaut, being one of the Mercury astronauts. I decided that's what I was going to be, and I knew the route ahead to get there. In eighth, ninth, tenth grade, I put myself on a path to go to the Air Force Academy. And that was my plan: I was going to be an astronaut, there were just no two ways about it. So I went to the Academy, class of '74. Of course, the Vietnam War was still going on, so they didn't really need astronauts; they needed fighter pilots. And so I was going to become a fighter pilot. As you can see through all these stories, I'm just rolling along with the flow. There's no conscious decision to become an airline pilot; I'm just flowing with what was happening. I started my training in fighters, and then they realized, you know, the war's over, which is awesome. I'm glad I didn't have to go.
But now I had to look for another job. I was in pilot training, and I saw one of the graduates from a class ahead of me get a C-9. A C-9 is a DC-9, a McDonnell Douglas jet that is used in commercial aviation. All the instructors were eyeing it, and I go, well, what is the C-9? So I looked it up and go, oh, this is awesome. I had just gotten married out of the Academy, and now I started to realize that, okay, I want to be an airline pilot. So I went off to the Air Force, and I flew probably the best job you can have in the Air Force: aeromedical air-evac aircraft. During peacetime there can be no more rewarding job in the Air Force than to actually have a job saving lives and caring. Basically, it's a gigantic medical aircraft where you pick up burn victims and all kinds of patients. It was a very self-actualizing job. And then I decided I was going to get out. Most all of my friends and all of my cohorts were getting out of the service, because it just wasn't the place to be. I put out plenty of resumes to several airlines; nobody was hiring at the time, but I was lucky enough to get hired by two or three different airlines. I just decided to take the very first job that was offered to me, and I went with that. I went to a small company called New York Air, a brand new startup airline. So you can imagine both the thrill and the heartache of being in an airline that was just trying to figure out what they were doing. Then we had the PATCO strike that almost put us out of business.
I can remember, because I had been an instructor in the C-9 in the Air Force, and they flew DC-9s, I was put in very many different roles with a lot of responsibility at a very young age. Because I had instructed so long in it, I had more flight hours in type than many of the older guys, and so I learned quick and fast; I was a 27-, 28-year-old captain. And then the rest of it was progressing through Continental Airlines, then finally the merger into United Airlines, and then the final merger into retirement. That's all you asked. You asked for the time, and I told you how to build a watch.

Aaron Harmon:

Oh, it's good. Very good background. The first time we talked, I asked you about the Gimli Glider incident. Oh, yeah. And you started recalling lots of pieces of information, and that was years ago that it happened. When that happened, as a pilot, is there some kind of broad announcement that occurs? How did you find out that this had actually been an incident?

Sam Biondo:

Well, it's not like I have an encyclopedic memory or anything like that; it's just a pilot memory. All pilots are curious about any incident that comes to the press; we want to know why. Because there but for the grace of God go I. We do not want to be the guy that duplicates that error. So virtually all incidents, and I'm not just talking about accidents that are fatal or anything like that, virtually all incidents like that, pilots are very interested in, and we research them. But we also have, in the major airlines now, not so much back in the day, a very intense safety program, and they have safety bulletins on several different layers. Not just the big incidents, but every little thing, like losing an engine over the North Pole, which is something everybody wants to know about. We have a very intensive program to get the information to you, and it comes at you from several different directions. And then of course there's the pilot end of it: you're pretty much responsible for making sure that you're educated all the time. There's never a situation or time that any pilot can feel comfortable unless he studies many of these crashes and errors and incidents. And then on top of that there's a training program. It was initially started in the mid-'80s; I came up before it existed, so I saw the difference between before and after. We had programs where they would bring us in and show us many errors that pilots had made in different airplanes and different crashes, and then we would analyze it intensely, as a group. We would garner a certain amount of information in our own minds as to how that happened, why it happened, and what you would do to prevent it happening to you. And then there's a lot of networking. We call it networking now; it's just scuttlebutt.
We're pilots in the cockpit: do you remember that crash where this happened? And then you talk about it, and you bounce it off of other pilots to get a mirrored view, some other professional's view of what happened, to see if your viewpoint and their viewpoint mesh in some way, and what learning we got from that. So when you brought up the Gimli Glider (we don't really call it the Gimli Glider; it was just that 767 Air Canada crash, you know), I knew quite a bit about it, if only in the sense that you catalog in your mind so many of the things you wanted to take away from what happened. When you mentioned it, then of course there were the other parts, you know, it being on an abandoned runway. Well, that's important, because the only reason it was on that runway is that the first officer had been in the Royal Canadian Air Force and he knew exactly where that place was. That was not just convenient, it was fortuitous, and it probably saved everybody's life, because of his knowledge of that. And we took away that the captain had glider experience, much like Sully Sullenberger. I knew Sullenberger; his classmates called him Chesley, because that's his real name, Chesley Sullenberger. At the Academy he was a year ahead of me. He was a sailplane pilot and instructor at the Academy, and a lot of people don't know that. Because when you lose both engines in an aircraft, you're flying a glider, and it's a completely different aircraft with completely different responses. So it was very fortuitous that the captain on the Gimli Glider happened to know how to fly a glider. It was very, very important that those things came together. And I think, when pilots talk, I bet you if you were to ask five pilots, they'd all tell you they remember it.
I mean, maybe not the younger pilots, I don't know, but pilots my age would all say, yeah, they ran out of fuel, he miscalculated his gas, he took off with an airplane that he really shouldn't have trusted, without having any indication as to what fuel load he had. That would pretty much be the takeaway that most pilots would have from that crash. So yes, we remember that. We remember crashes that make huge changes in the industry. Air Canada's DC-9 fire near Cincinnati, where they caught on fire and 23 minutes later the airplane was burning on the ground, gave every pilot a shiver throughout the industry as to how fast you could burn up in an aircraft. And that was the memory we brought back from that: to change your motivation as to how fast you have to do stuff in an airplane if there's a fire. Now, that's just an example; I can bring up probably ten other examples that changed the industry.

Aaron Harmon:

You had talked about accident chains, and how breaking a chain is key to stopping accidents. Could you elaborate on that?

Sam Biondo:

Absolutely. It's not just in the airlines: any incident that goes bad is generally never just one thing that causes it to happen. It's probably several things that have happened, none of which have been corrected, and they all link together to cause something we don't want to happen. And that's not just in flying; it's driving your car, it's working at your job as a lathe operator. In every single situation, it's several different instances, each of which could have been remedied, and any one of which, just by breaking that one link, could stop the accident from happening. So we're taught the mental viewpoint of a chain. I think we all recognize that it happens in real life: it's not just the fact that you mopped the floor, it's also the fact that you decided to wear slippers, you know, when you fall and crack your head. Each of several of those different things could have changed, and you would not have cracked your head.
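The accident-chain idea can be sketched as simple arithmetic: if an accident requires every link to fail, the combined odds are the product of the individual odds, and fixing any single link collapses the whole chain. The failure probabilities below are made up for illustration, and the links are assumed independent.

```python
# Toy "accident chain" model: an accident occurs only if EVERY link fails.
# Probabilities are invented for illustration; independence is assumed.

from math import prod

# Hypothetical per-link failure probabilities, loosely echoing Gimli
links = {
    "faulty breaker left active": 0.05,
    "logbook entry misread": 0.10,
    "dispatch despite blank gauges": 0.02,
    "wrong unit conversion": 0.01,
}

p_accident = prod(links.values())
print(f"All links fail together: {p_accident:.0e}")  # 1e-06

# Break one link (make that safeguard reliable) and the chain is gone.
links["wrong unit conversion"] = 0.0
print(f"After breaking one link: {prod(links.values()):.0e}")  # 0e+00
```

The design point is the same one Sam makes: you don't have to fix everything, you only have to reliably break one link, which is why layered safeguards work even when each layer is imperfect.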

Aaron Harmon:

And that's definitely true with this airplane incident, where the fuel gauge was faulty because a technician had forgotten to take the breaker out of service, and then the manual fuel calculation was incorrect. Mm hmm. And if any one of those had been stopped, it wouldn't have happened.

Sam Biondo:

Oh, I could count probably eight or ten things in that accident that were errors, and many of them were just inadvertent mistakes. Several times, if you read the write-ups on many of these situations, you see the response of one person to another: a lack of communication, a miscommunication, or an assumption made that was incorrect. Those are all links in the chain. The first officer's communication with the fueler; the captain's assumption after talking to the pilot that brought the airplane in. By the way, that was one of the things, a big red flag came on with me. We're taught that just because an airplane flew in doesn't mean it can fly out. For many, many years that was the assumption: the plane flew to get here, so it's obviously going to fly. No, that's never the case. You're on a completely different flight now; everything you do starts from this moment, not from anything that happened beforehand. Now, we try to document it as best we can, and in that documentation, by the way, there was another error: an individual does not pull a circuit breaker, or expects a circuit breaker to be in. That was another link in that chain. Many of them were inadvertent; some of them were just straight-up mistakes. We are always taught, and I always taught my first officers, and when I instructed, I taught other captains, that there's a bottom line. There are thresholds that you do not cross in order to ensure the safety of your flight. Many of them are not written down; many of them are just gut feelings. If I were sitting in a room with a bunch of pilots all talking about that accident, I think all the captains would have gone: I'm not taking an airplane with no fuel gauges.
I don't care what you say; that's the bottom line. I don't care what you say the airplane can do, I'm not taking an airplane where I can't immediately determine my fuel load. And many of the links in the chain can be broken through knowledge of the airplane and experience. It's never really mentioned in the narrative of the story that there are other things in the airplane that correct for those errors. The weight and balance is changed by the fuel load, and the weight and balance is being monitored by the aircraft. The EICAS, the engine indicating and crew alerting system (boy, it's been three years), tells you what a safe range is for the airplane to take off in, weight-and-balance-wise, which is incredibly important. And the more fuel you put on an airplane, the different the weight and balance, because the wings are swept back; as the wings fill up with fuel from the inside out, the fuel goes to a different location, aft. So the airplane feels different. An experienced pilot, and probably this captain and every other captain, even though the airplane was new to their airline, will know whether an airplane has a full fuel load when you take off. It's not just that it's heavy; the weight and balance changes the load on the nosewheel, and it determines how fast you rotate as you come off the runway. An experienced pilot is going to feel that. He's going to say, this is different, and he's going to add that into his decision-making: okay, when we took off, do you remember how light it was on the nose? Because that seems like there's something going wrong here. Also in the system, we have the ability to watch every one of those tanks. If you remember, the article talks about the fuel sticks; they had to go out and stick the airplane, as we call it.
So you have to go out, and it has a little magnet: the mechanic goes up, runs the magnet along, and reads all these different little fuel sticks, and that determines where the fuel level is in each of the tanks as they're split up in the wing. And the EICAS system will actually show you, all the way through your flight, how much fuel you have as a volume but not as a weight; it will then compute the weight. Now, of course, they probably had that in wrong too; they were probably going with liters instead of gallons. But it wouldn't matter, you'd still see that, wow, our outboard tanks on this wing are empty; they should not be empty right now. But that's Monday-morning quarterbacking. I'm not saying that any of us would have gotten anywhere except Gimli if we'd taken off with the airplane he took off with. But there are many, many other things that can interject to break those links in the chain in that scenario.

Aaron Harmon:

Do you ever have any memories of breaking a chain in your career?

Sam Biondo:

Well, absolutely, and I've had my chain broken by others too. We have a saying: piloting is the constant correction of error. And you're making a lot of them. So when you fly with four pilots, and in my case I was very lucky as a senior pilot, when you're flying a Triple Seven at United Airlines, you're flying with four pilots, because we're flying long flights, and every one of those guys is so senior that they have been captains on other airplanes. So there are four captains in the cockpit. If I'm making errors, and I do make them, there are three other sets of eyes looking at what I do, going, no, Sam, let's take a look at that again. And so they would fix that. I've also broken my own links in my chain. I guess you'd probably ask for a specific instance. Okay. This would probably apply a little bit more toward the accident in Gimli, in that nowadays your mechanics enter everything electronically as to what's going on in your airplane, and we get a printout in our flight-planning paperwork as to everything that's happened in that airplane and when. You're responsible for going through the vast majority of that, pages and pages. But in the old days (okay, it makes me feel really old), you just had a logbook, and in that logbook a write-up would be written up on one page, which was then pressure-sensitive to two other copies. One copy went to the base that you were in, another copy stayed permanently in the logbook for you to read, and the third copy went off to be logged into the maintenance log that was then sent to the chief dispatchers in Houston. So the airplane was always being followed as to what was wrong with the aircraft. And I had a really good friend, and I'll just mention his name because he was a great pilot; he's still a great pilot, he just retired.
Tom Bernardo, a good friend of mine; we'd flown many years together, and when I was a senior pilot he was my first officer. He took off with a grounding item, an item the airplane should never have taken off with. And the reason he did that is the logbook. When we write on those pressure-sensitive pages, we don't want the writing to press through to the pages beneath, so we have a little cardboard flap that goes underneath, so that when you write on the top copy it doesn't go through all the other copies and make them unreadable. And the flap was there. Well, a captain previously had written up a grounding item; it was a serious grounding item, but not something that would be obvious to the pilots in the cockpit. And it was under the flap. So he read down to the flap, signed off on the aircraft, and took off, and then found out when he landed that he had flown an airplane that should not have been flown. He was disciplined by the FAA, the FAA fined the company; it was a pretty serious problem. Well, having that happen, and having him tell the story, scuttlebutt-wise, as to what happened to him: on every airplane I ever flew, even though you only had to read down to the last copy, I would pull that flap out and look down underneath, to see if there was another write-up under there, hidden, that I didn't know about. And I would tell my first officers, when I was flying, I'd pull the logbook out and be reviewing with them the things we have to be concerned about with this airplane, and, by the way, let's look under the flap and see if there's a grounding item. It's always a teachable moment, something you tell all the pilots that you fly with, and then they will tell all the pilots that they fly with, and pretty soon you have a herd immunity against not looking under the flap.
So I'd pull that flap out, and over the course of my career, two other times, I saved my own ass. I flipped it open, and there was the write-up that was going to ground the aircraft, right in front of my very eyes. That one little thing saved me twice, where I would have been Tom Bernardo times two, only because of that one small item. And I think this applies to the Gimli thing, in that the write-up situation, the way they transferred the information from the mechanics and the technician that was working on the airplane to the pilots, was an error. It was one of the links in the chain, and it could have been broken by all three of them.

Aaron Harmon:

So I have hardly any experience flying, besides a few little pilot lessons, and I'm trying to imagine being at 40,000 feet, losing all your fuel, and just having a big glider. Yeah,

Sam Biondo:

I know. I want to thank you.

Aaron Harmon:

But as a pilot, how do you prepare for that?

Sam Biondo:

Well, it's interesting. In the narrative, and I was reading the Wikipedia article, they talk about how we're not trained for that. Well, yes, we are. We're just not trained for losing all of our fuel; we're trained for losing both of our engines, and trying to get an engine restarted, which obviously wasn't going to happen here because the fuel was gone. I had some questions, if I'm Monday-morning quarterbacking, because there are a lot of warnings that occur on the aircraft based on just the volume in the wings, as far as how much fuel you have. And when those warnings come on, it's serious; it's time to get on the ground now, while you still have gas. The airplane is designed, basically, to suction-feed fuel even when you lose the electrics, and there's still fuel in the airplane, especially when you tilt the nose down to descend; the fuel is going to go to the pump that's going to be most necessary in that situation to get you on the ground. So I'm not sure why there wasn't a sooner response to the fact that they were running out of gas. But I'm sure they asked him, and he seemed like he was thinking ahead of the game the entire time, so I have to say he probably knew what he was doing. The fact that they had lost all the fuel meant you're never going to restart the engines. The airplane is designed to fly with two engines not running. The engines are still turning, because of the airflow; there's just nothing driving fuel into the engines that's going to give you any thrust. And then you have what's called the ram air turbine. It's a windmill that falls out automatically when it senses that both engines are not running properly, and it has a priority: it wants to give you electricity first.
And then it's going to give you some hydraulics, if you're lucky. The airplanes that Boeing flies, the 767 and 777, are not like the airplanes of old. When we controlled the aircraft in, say, a DC-9, if you moved the controls in the cockpit, there was a gigantically long cable that went from your controls through turnbuckles all the way back to the tail, to the stabilizer, to the flaps, and moved those surfaces physically. The 767, 777, and all the electronic airplanes of today don't do that. I don't know if it's really for public consumption, because I'm not sure how many people would feel safe if they knew: you make a move in the cockpit, and it sends an electronic signal to those surfaces, and that electronic signal is translated into hydraulics flowing to the surfaces. Because the surfaces are very large and the air pressure on them is very large, you need a great deal of power, and hydraulics gives you that power in order to produce the change in the surfaces on the wings, on the tail, and on the elevator. The RAT does not have the power to do that unless it has a great deal of wind velocity, unless it's going very fast. It may not give you hydraulics, but it's going to give you the electrics, which is good, because you're going to need them, for instance to be able to use your emergency instrument landing system to get to the ground. If you lose your fuel, your engines are never restarting. We were trained to go through a procedure to try to restart the engines, and in simulators all around the world they're training their pilots on what happens when, and don't think it can't happen. I don't want to alarm anybody, but if you go through a volcanic ash cloud, your engines are going to stop, and now you have to figure out how to clear them.
Get the engines running again. If you go through birds, you may lose both engines; in the crash on the Hudson with Sully, you know, he's not going to get those engines back. But our Triple Seven engines would eat that flock up, and all you'd do is smell it in the air conditioning; the engine on a Triple Seven is the size of the fuselage of a 727, so no amount of birds is going to stop that engine. But there are going to be things that happen, like fuel contamination. Thinking about other airplanes: British Airways had both engines start coughing and sputtering on landing, I think in London, because the fuel system had iced up and clogged on both of them, and I think they might have lost the engines and restarted them. Anyway, everybody started going, okay, what happened there? Why did we have ice where we shouldn't have ice, and how do we lose both engines? But in a situation where you lose both engines from fuel exhaustion, there's no restart. Now you have to do exactly what those guys did, and by the way, kudos to them; they did everything right to get that airplane on the ground. I was very impressed by the fact that he understood that it's better to use your glide angle at what we call lift-over-drag max. Pilots are very versed, by the way, in how the airplane flies, not just in flying an airplane, but in the aerodynamics of an aircraft. We know, when we're gliding, that there is a sweet spot, an airspeed that will get us the furthest across the ground, called the L-over-D max. And he knew it right off; he knew exactly what speed to fly to get where he had to go.
But he also knew, as a glider pilot, that you have to be pessimistic. You have to think, I'm not going to make it; it's better to miss long than to miss short. Of all the things, that was what I was most impressed with about that captain: he knew you cannot land short. If you're long, you'll find a way to get down, but you'll never find a way to get there if you come up short. And so he was long. Now he had to cross-control the aircraft. Cross-controlling an aircraft, okay, I have to explain that. When you push the rudder in, it yaws the nose. The ailerons roll the wings up and down; put an aileron out and it makes the airplane turn. But if you yaw against it, the nose is going to go straight. So if you put the two of them against each other, you're still going straight, but your wing is down. By doing that, you're going straight, but now you have all this drag out there, because you've got the aileron doing what it can do and the rudder out there doing what it can do. So it's dragging the airplane and slowing it down, letting him get down faster. That's why he cross-controlled the airplane at the end. As a glider pilot, he knew how to use the controls to his best advantage. They were still going way too fast, but what are you going to do? If you can get that airplane on the runway, any way you can, and he did. It's a pretty impressive thing, and I was really impressed by how he did that. Suffice to say, nobody wants to lose two engines at altitude.
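The L-over-D max idea Sam describes turns into simple arithmetic: still-air glide distance is roughly altitude times the glide ratio. A small sketch (the 12:1 glide ratio is a commonly cited approximate figure for a clean 767, not a number from this transcript):

```python
# Rough sketch of best-glide range: distance = altitude * (L/D).
# The 12:1 glide ratio is an approximate, commonly cited 767 figure.

FEET_PER_NM = 6076.12  # feet in one nautical mile

def glide_range_nm(altitude_ft: float, lift_over_drag: float) -> float:
    """Still-air glide distance in nautical miles from a given altitude."""
    return altitude_ft * lift_over_drag / FEET_PER_NM

# From 41,000 ft at L/D max, the powerless 767 could cover roughly:
print(f"{glide_range_nm(41_000, 12):.0f} nm")  # about 81 nm
```

Flying faster or slower than the best-glide speed lowers the effective L/D, which is exactly why the crew's choice of airspeed, and the deliberate cross-controlled slip to throw away excess altitude at the end, mattered so much.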
But in our training, you're expected to get them back. You're expected to restart them, to find a way to get at least one of them running again in order to get to wherever you have to go, especially in a two-engine airplane like a 777, when you're sometimes three hours away from your closest airport that you can land at. So it's really important to get that one engine running. But pilots have a saying, and when I say pilots, I mean every pilot says this: it's better to use your superior judgment so you don't have to demonstrate your superior skill. You can put yourself in a situation where now you have to use all this great skill, where it would have been better if you'd just used good judgment in the first place so you didn't have to. Hope that answers

Aaron Harmon:

it. No, that's very good. So everything you were describing talks about preventing things from going wrong, relying on teammates to do that, and recognizing changes that can break things. That works great in aircraft, and the aviation industry has a lot of lessons that I think we can learn from in the biotech space. In the biotech space, we implement a lot of quality systems to check, and we're trying to make sure that methods don't go awry. It sounds like a lot of those lessons are applicable. Are there any other lessons out of aviation that you would say may apply to other industries?

Unknown:

Well, yes, but I preface it by the fact that I think the question you asked should be reversed, in that I think the things you do in quality control are the things that we as aviators need, or have used, in order to enhance our safety. Safety issues are basically quality control issues. So yes, they apply, because they're pretty much the same thing. There are so many things that you need to be introspective about in the entire system. In quality control, you have to see. Okay, I have a friend of mine from many years ago, when I was in the Air Force. His name was Rob Milburn; he became a doctor, but he used to fly the C-130 in Vietnam, which is just a flying emergency situation, because it's got four engines that are always breaking down, everything's wrong with it. And so I was his first officer; in the Air Force we called them co-pilots. And he taught me so much, but he had these rules, because when everything's breaking and you're in combat, you have to think these things first. And the first rule, he said, when he had an emergency in an aircraft: is it really broken? Am I doing this, or is the airplane doing this? Is it broken, or am I just misinterpreting it? Could it just be the fact that a circuit breaker popped, or is there something out there that I'm misinterpreting, before we go on with all the stuff I'm about to do? That was always the first thing. The three rules were: is it really broken? Secondly, if it's broken, do I really need it? And if I really need it, is there any way around it? But in the quality control business, all the things that you do, and the layers that you use, are based on introspection of what you're doing first, and then looking at what went wrong, and then reverse engineering to prevent it. That's what you do as a pilot.
I can remember situations, when I was a check airman for instance, the danger of groupthink. We'd have a bunch of check airmen, and we'd all come up with the same conclusion, and the conclusion was wrong. But how could that possibly be with the best pilots in the Air Force, or the best pilots at United Airlines or Continental Airlines? And we still get it wrong, or we have an institutional bias, or we never look into whether the way we've been doing this all this time is really right. Those are all quality control issues, really. Here's an example. I spent 15 years, I guess, on the 777, and most of that you'd be flying over the North Pole and over Russia, which I never thought I'd do as an Air Force pilot, because, you know, you fly over Russia, you're going to get shot down. So we're flying over Russia, and different airspaces around the world, based on their own country codes, will use different units for their altimetry, meaning how high you are in the air. The US uses feet, 33,000 feet, as our altitude; many others use meters. And as such, when you fly from one to the other, you have to adjust. Right after World War Two, an international group, ICAO, the International Civil Aviation Organization, came up with a chart so that pilots flying from one system, in feet, and transitioning to another, in meters, could adjust as they crossed that border, so that both sides knew which altitude the pilot was going to be at. And there's a chart on our charts; you just look at it and read it. If I'm supposed to be at 33,000 feet when I cross, it should be, say, 33,100 feet, and that is the approximation of that altitude for them in meters. And of course, on the 777 and 767, you just push a button and it'll tell you both, how many meters and how many feet you're at.
Well, at our airline, our chief pilot and his instructors decided that being between two altitudes would be wrong, because you can't be at one or the other altitude, so we're going to adjust it up to the next highest altitude, 100 feet higher, in order to be safe. Now, should we have done that? It sounded good, but it was wrong. Now we're at the wrong altitude. And one night I'm flying across Russia, and it's very, very quiet, and the Russians love to talk, because they're sitting there in their little cubicle with nothing else to do but talk to maybe one airplane every 30 minutes. I'm talking to the guy and I go, listen, I'm pretty sure we're at the wrong altitude, aren't we? And he goes, yes, we know, you're always at the wrong altitude. And I go, well, why is that? He goes, oh, it's just Continental Airlines, we know you do that. I came back and I wrote that up. I go, even the Russians know we're wrong. Our entire airline has been doing it wrong for 15 years, and nobody ever said anything. So you've got institutional biases that you've got to find and root out. Those are all quality control issues, you know.
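The feet-to-meters adjustment Sam describes is mechanical, but it is easy to be off by a consistent amount, which is exactly what happened at his airline. A hedged sketch of the raw conversion (the actual crossing values come from published ICAO charts; this just shows why a chart or an FMS readout is needed):

```python
METRES_PER_FOOT = 0.3048  # exact by international definition

def feet_to_metres(alt_ft: float) -> float:
    """Raw unit conversion from an altitude in feet to metres."""
    return alt_ft * METRES_PER_FOOT

# 33,000 ft is not a round number of metres, so metric airspace assigns
# the nearest published metric flight level rather than this raw figure.
print(feet_to_metres(33_000))  # roughly 10,058.4 m
```

A crew applying a fixed +100 ft "safety" offset to the charted value, as in Sam's story, is off by a consistent 30 m everywhere, and every controller on the metric side can see it.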

Aaron Harmon:

that's great. Sam, you had sent me a photo of a threat and error model. I have it pulled up, and there's a triangle: across the top it says mission effectiveness, underneath that is threats, then errors, and then undesired aircraft states. Could you walk me through that?

Unknown:

It's a little bit complicated when you're first looking at it, but basically it's talking about the depth of your error and what you're willing to do to get back to a state that's proper. With many smaller errors, in a quality control situation, you may have a pilot or a group of pilots who think it's no big deal, and so they just kind of consciously ignore it, and sometimes unconsciously ignore it, because that's the way we were taught to do it, but it's wrong. There'll be situations where a pilot will decide that he wants to fly a particular approach; there's a myriad of different situations. I just want to talk about the fact that the states on that chart get more and more serious, with higher and higher consequences. And we are taught, as both captains and first officers, to interject and solve those problems down at the bottom there. We have captain leadership courses and cockpit coordination classes where first officers are taught to chime in. You have to start using the words: I'm not doing this, I am uncomfortable doing this, this is an improper procedure, Captain, and we will not continue. Up to the point of: I'm taking the aircraft from you, I am now in control of the aircraft, we will not do this. That's not mutiny; you've obviously seen something wrong, you're seeing it differently than he is, and we're going to have to return to a correct state. Each of those states is different, and they require different responses to get you back to a state where it's appropriate. But you say that would never happen to me? Oh, it happens all the time.
We have a pattern where you're taught, as you're getting closer to a runway, that you have gates. These gates are basically things that are acceptable in the aircraft state and things that are not: maybe your airspeed is a little bit high, your altitude is a little off, you're off course by this amount. Well, as you get closer to the runway, those parameters become tighter and tighter. But pilots, all of us, will have an ego input to that: I know I'm high, but I'm going to make it. I know I'm fast, and I know I'm high, and I'm going to make it. And you're trying to convince yourself and the rest of your crew that you're just fine, and you keep going until you find yourself in a situation that can't be saved; you've painted yourself into a corner with your own ego. We teach that we do not allow you to go through those gates without correcting, and there are certain places where you not only cannot correct but you have to abort the landing, you have to go around, do something else, come back around and try it again. But there's the ego input that overrides it: no, I'm fine, I'm going to make it, because it's embarrassing going around. Yeah, well, better to be embarrassed than to take an airplane onto a runway going 20 or 40 knots faster than you're supposed to, blowing your tires and going off the runway, which has happened. In our cockpit coordination classes, they actually have the black box data, where they take a situation and replay it as a video of the aircraft doing what it was doing, as it simulates a particular landing. And they'll pull out some really egregious situations where pilots do stuff that there's no way on this earth I would ever let myself do. But the point is, you're looking at somebody that did; he was trained just like you.
He had all the knowledge you had; it's not that he didn't know what he was doing. Not only that, he was with somebody who also let it happen and didn't say anything. So, quality control, to go back to the beginning of this question, is the same. It's just a different name, and we do the exact same thing. We call it threat and error management; you call it quality control, but
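The approach "gates" Sam describes can be expressed as a tightening tolerance check. The specific heights and speed tolerances below are invented for illustration, not any airline's actual stabilized-approach criteria:

```python
# Hypothetical gates: (height above runway in ft, allowed airspeed error in kt).
# Tolerances tighten as the aircraft gets lower.
GATES = [(1000, 20), (500, 10), (100, 5)]

def must_go_around(height_ft: float, speed_error_kt: float) -> bool:
    """True if the aircraft is outside any gate it has already descended through."""
    return any(
        height_ft <= gate_height and abs(speed_error_kt) > tolerance
        for gate_height, tolerance in GATES
    )

print(must_go_around(800, 15))  # False: 15 kt fast is inside the 1,000 ft gate
print(must_go_around(400, 15))  # True: outside the 500 ft gate, go around
```

The point of encoding it this way is that the rule, not the pilot's ego, decides when a go-around is mandatory.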

Aaron Harmon:

it's the same thing, you know? Well, that was great. Thanks for being on the show. And I appreciate you having me. Now we'll take a quick break to hear from one of our sponsors.

Unknown:

Today's startups become tomorrow's growth engines. In South Dakota, we're entering a new stage of expansion for our biotech industry, and you'll want to be part of it. Hi, I'm Tony Johnson, Executive Director of South Dakota Biotech, the state affiliate of the international BIO organization, and we're proud to be leading a state that's driving innovation to feed, fuel and heal the world. South Dakota Biotech is here to inform, to connect, and to advocate for our critical industry. Whether you're directly involved in biotechnology or looking to learn more about it, we want to hear from you. Find us at www.sdbio.org. Now back to the show.

Aaron Harmon:

Joining Diane and me today is Joe Ostendorf, a medical device regulatory affairs consultant. Welcome to Inside Out Quality, Joe.

Unknown:

Thank you, Aaron. Thank you, Diane. It's great to be here.

Aaron Harmon:

I have to connect some dots here, because I think both Joe and Diane have asked me: how are we going from airplanes to regulatory affairs? In the case of this flight, the goal is to get customers from point A to point B and to do it safely. And the way this works in aviation is you have checklists and a series of steps to go through in a regimented format. The reason that this Air Canada flight ended up at 40,000 feet without any fuel on board was because those safety checks and that regimented process had failed; there were mistakes that were made. That reminds me of design control. With design control, there is a series of regimented steps, and if those are followed in medical device development, you are essentially doing the checklist to make sure that by the time your product goes to the FDA, and you're going into clinical trials, and eventually to customers, you have gone through all the steps to make it as safe as possible. So is that a fair connection? What do you guys think?

Unknown:

I completely agree. I find there are a lot of parallels between the aerospace industry and the automobile industry and the medical device industry, in that patients' lives, or people's lives, are at stake in all of them.

Diane Cox:

And I was just going to say, from the design control perspective, it's not only that; it's also the connection into manufacturing controls and process controls. When you think about the things that need to happen on a daily basis to get the product from point A to point B, if anything gets missed along the way, there's a potential that even though your design might be great, the execution of that design in manufacturing may need some work.

Aaron Harmon:

That's a really good point. For an accident to occur, there is a series of things that have to happen; it forms a chain: this happens, and that leads to this. And so there was a faulty circuit breaker that led to the gauges not working, there was a miscommunication between the pilots, there was a misunderstanding in the fuel conversion, and when you lined all those up, that's where there was failure. Are there similar things that you've seen in the regulatory space, Joe?

Unknown:

Absolutely. There's never just one major issue that fails; it's a thousand tiny, incremental, minute details or issues that lead that way. In my experience, every device that's been on the market has had failures. Nothing is ever designed or developed in a perfect manner. And oftentimes those issues that arise in your design and development activities, specifically in your verification or validation activities, might seem small and incremental and might seem to be resolved, but then later manifest themselves as a larger issue. Some of them are observed in design verification and validation activities, and some of them aren't; they go unnoticed, because you don't know you need to look for them. But once you get into the field, you discover that it is indeed an issue. Years ago, I worked on a device that was a vascular implant that had collagen in it. It was designed and developed as a graft to go into the body, and it was verified, and it was validated, and it was released. It was on the market for many years and worked very, very well for its intended purpose and intended use. And at one point, we got a complaint that the device had water droplets on it. We had never seen this before. The package was unopened; how did it get water droplets on it in a sterile, sealed package, well within its shelf life and usable life? We started investigating, and you go down the path of: when was it manufactured, having lot traceability, where was it manufactured, and then where did the components come from. And because it had collagen, and one of the sources of collagen is sheep.
And so I had to investigate the sheep farm in Australia where we sourced the collagen, and what they fed those sheep over the course of their lives to make this collagen for this implant. And we couldn't find anything that was a smoking gun as to why this handful of packages had water droplets on them. Everything was within spec. As it turns out, and this is one of my favorite stories about medical device failures, we later discovered in the investigation that the complaint came from Central America, and in the mid-2000s in Central America, devices were being hand-delivered to hospitals on the backs of burros over mountain passes. The panniers on these burros did not have covers over them, and the package was exposed to sunlight, whereby it wicked some moisture out of the collagen that was inherently there, and that was observed as these water droplets on the product within the package. And the product was still safe, it was still within its intended use; it could have been implanted, of course, but nobody wanted to. But who would have thought, in the mid-2000s, that a class three medical device, made by a manufacturer in the United States with distribution worldwide, sourced from sheep collagen in Australia, would end up carried on a burro's back in Central America? It's a long story, but the point is to say: when we were verifying and validating it, nobody ever expected that that would be in its usable life; everybody thought it would sit in a sealed box and be distributed. So that's an example of something that, while in that case it didn't cause unnecessary or undue harm to the patient, and wouldn't have, was something you didn't anticipate in your verification and validation activities.
So we can sit around and think about all the things that we anticipate, but we still have to plan for the unanticipated. Just as I'm sure the pilot on this flight and his team were pretty confident with their mathematics and their figuring of how much fuel they had, they had to react when it turned out that they didn't have enough fuel. And that's part of what we do, and that's one of the similarities I see.

Aaron Harmon:

And the process of reacting is also controlled. You have a formal investigation process, you have complaint handling; you have, essentially, an entire step-by-step for how to manage those things.

Unknown:

And a health hazard team to review it, a cross-functional team to have all perspectives weigh in, indeed.

Diane Cox:

Yeah, you kind of already mentioned it, Joe, but I'm just surprised you guys didn't include that in your validation as a simulated possible scenario. You just didn't think to test shipping on a burro?

Unknown:

Yeah, of all the planned scenarios

Diane Cox:

to come up. We

Joe Ostendorf:

You know, we did, of course, shipping and distribution testing, which includes a high-temperature environment. You plan for it, and this is obviously now part of the process and part of the standards: you plan for temperature extremes, and you have to label as such on products. You plan for sitting in Siberia in an uninsulated warehouse for a month, and you also plan for sitting on the tarmac in Arizona in July, right? You kind of have to plan for the extremes. And we tested for that, and it didn't fail. The part that we didn't plan for was it being directly exposed to sunlight, where there was no barrier to prevent the sunlight from wicking the water out of the collagen. The quick solution to that problem was we put a reflective film on the inside of the package to direct the sunlight away. And of course, the labeling already said do not store in direct sunlight, but it didn't say do not transport in direct sunlight. So of course, then we had to amend that as well.

Aaron Harmon:

Sam had mentioned how, when there's an aviation incident, pilots remember it; they want to know what happened, because they don't want to be the person making that same mistake. Do you think we do that in this field, Joe?

Unknown:

We certainly do. I think, and Diane can attest to this even better than I can, of course we have procedures, we have work instructions, we have templates, that all help us prevent these things from happening again. We document them, we amend them; they're living things that help us try to prevent making these mistakes again. And it's almost sad to say, but one of the most foolproof things that seems to work is tribal knowledge. All of us carry with us these badges of honor, almost, of the war stories that we've shared, the bad experiences that we carry with us to help prevent them from happening again. Interestingly enough, just prior to my preparing for this podcast, I was speaking with a former colleague; we used to work together at a company, and he'd moved up, kind of outside of his space in the science department, into also being responsible for research and development. He was telling me about a failure they had observed in their packaging. Not all my examples are packaging-related, just these two. They observed this failure where the seal and the bond were failing spec in a verification activity, and they were also having micro-tears based on how they were folding the package. He shared it with me, and I said, oh, you had that issue again. And his face went just white and blank, and he said, what do you mean, again? And I said, well, you know, we had the same issue six, seven years ago when we were there. Now, it didn't cross his desk, because he didn't work with it as closely. But it was identically the same issue we'd had years prior. The procedures had been updated, and the test data had been recorded, but the specs had migrated over time through design changes. And the one piece that wasn't there was that tribal knowledge; all the people who had worked on it had since moved on to other companies.
And identically the same failure had occurred again, which was kind of striking to me. With all of the procedures and all of the work instructions and all of the best efforts that everyone has in these companies, sometimes one of the best tools, just like with the pilots, is that shared knowledge. We always say, if you don't document it, it doesn't exist. But even with all of those efforts, you have to know to go look for it. You have to know to do that diligence, and if you don't, it comes back to bite you. In their case, it cost them some time and money to end up at the same place where we'd started years prior.

Aaron Harmon:

So one thing that I really clung on to in this episode was cockpit coordination classes, and how Sam talked about how you're flying and you can have an ego getting in the way of making good decisions. Diane, Joe, have you ever seen scientists, engineers, or managers make mistakes because they let their ego get in the way?

Unknown:

I'll jump in with one quick one, but then I'd like to hear one of yours, Diane. Absolutely, Aaron, it's an easy thing to happen. One of the examples I can give: I was working on a device that had a heat signature to it. We wanted to focus heat at a specific location at a specific depth, and we helped mitigate this with a cooling component to the system. In early testing on the bench, in a simulated gel, the gel was to simulate human tissue, we would see these heat bands on the sides of the device. It was an electrical component that did the heating; you kind of had these peaks on either side of this electrical component and then the gradient heat that we wanted in the middle. And so an early solution, with the n of three that we tested on the bench, was a great idea, and I agreed with it: to put these dielectric strips on either side, where those heat signature peaks were, to stop the electrical current from conducting through there and basically let it flow where we wanted it to, to have that nice gradient heat signature. So we put those on, and everything was great. We tested in the gel once more, and it looked like we wanted it to look, and then we proceeded on. But a lot of factors happen, and as we talked about earlier, there are all those tiny, minute decisions that add up to big decisions. Fortunately, this was a device that was not commercialized; there were a handful of people in the clinical trial that were affected by this, which is unfortunate. But what we were finding was that those dielectric strips to help the heat signature, in this device that was getting retracted, would peel off, and we were having significant issues trying to keep them on and keep them in the place that we thought they needed to be.
The device had changed; designs migrate over time, lots of little design changes add up, and these dielectric strips were challenging to keep on. But what we later found with a better model, and we found the better model because of what we were seeing in the patients, was that they were actually not doing anything. They were completely ineffective, and they weren't even needed. And so we started to formulate this data, and we had multiple functions weighing in on it. We had the science department analyzing the heat signature; we couldn't analyze it during the procedure, but we could see the effects of it afterward in the patient. And we were starting to figure out that our design needed to be modified to shorten the electrical component that was providing the heat and the energy. What we were seeing, those peaks, wasn't due to anything other than the length of the electrical component. That was all it was. Even with the strips removed, whether they were there or not, we were still getting the same marks from too much heat being generated. A long-winded story, but simply to say that the one individual who had devised the idea was vehemently opposed to removing them. We needed to keep them, he said, and he actively worked to sabotage anyone who tried to convince the group that we needed to take them off. We spent hours and months collecting data, trying to test on the bench, finding a more appropriate model; we were using pork chops and pork tenderloin to test on and get a heating structure, all to put together this report to help convince everyone cross-functionally, all the way up through senior management. Everyone was in agreement that this needed to be removed, except the one engineer who devised the idea in the first place. I still stay in touch with him.
And he's a very fine engineer. But with respect to this one piece, his ego about keeping it really inhibited our ability to develop and find the solution. Oftentimes in this industry we say fail fast, fail early and fail often, because you learn from your failures. As I always say, nobody ever remembers the exam questions they got right; they remember the exam questions they got wrong, because those provide an opportunity for learning. Same with device development: we learn from our failures. Failures aren't bad; it's what you do with the learning after the fact. And this was an example where we learned that the strips were bad, and we spent a lot of time and energy not on finding the better solution but on finding ways to convince him that it was something we needed to give up. It cost the company a significant amount of time and energy. We eventually got there, with most people in agreement, save one. But nevertheless, it was an example of ego getting in the way.

Aaron Harmon:

So I was at one point in an R&D lab, and I was mixing serum samples. We had collected large amounts of serum from some animals and were making blends of these, and I had them broken into four groups. One group was a control group, and then I had a two, three and four. I was doing the mixtures, and while I was supposed to be pooling everything from group one, pooling everything from group two, et cetera, I was pouring and I realized the bottle I was pouring was the wrong one; I had actually blended one of the wrong pools of animals into this group. So I was like, shoot, you know, what are the implications of this, what did I just mess up? And about 20 or 30 minutes later I was back in our cubicle area talking with some other scientists, and I was kind of complaining that I'd made this mistake. One of the people on our team said, I saw you doing that; I thought it was really strange that you would mix that wrong one in there. So why didn't you say something? And she's like, well, I figured you knew what you were doing, so I just thought you had some plan. And I was like, you shouldn't ever trust me at all! For real, it was kind of impactful, because someone saw me making a mistake and just had this assumption that I must know what I'm doing and have a plan and couldn't possibly be making an error. And I was like, yeah, obviously you don't know me well enough.

Diane Cox:

It's amazing, when you do things with conviction and confidence, and you're maybe perceived as an expert in the area, how things can just be let be because of the circumstances. So, Joe,

Aaron Harmon:

I mentioned design control early on. Can you maybe give a real high-level overview of what design control is, for people who may not be familiar with it?

Unknown:

Sure, yeah. So design control is a series of steps; we've kind of been talking around it. Simply, design control is a process to ensure that you are building what you say you're going to build, and that it is going to do what you say it's going to do. Diane mentioned user needs earlier. FDA has a very good guidance around this that was written in the 90s, that is still very relevant today and is still being used by everyone; it's really kind of a gold standard, and they borrowed a graphic from Health Canada. Design control is so important that it doesn't matter where you are in the world; in the design and development process, everybody kind of does the same thing, because it's a system that works and has feedback loops in it. You start off by saying there's this unmet need. It could be anything: devices, pharmaceuticals, in vitro diagnostics, biologics, it doesn't really matter. But there's this unmet need, and what is that unmet need, and what would the solution look like? So this is your user needs document, or your user needs analysis, or your market requirements document, however you want to scope out what you need to fill this unmet need. What's the solution? What does it look like, what does it feel like, how does it operate? From there, it feeds into a design specification document, or a product specification document is another way of describing it, which tells you it's this big, it weighs this much, it has these dimensions and these orientations and these marks, it needs to have a shelf life of this long. That last one is more of a market requirements item, but it's something that can also fit into the product spec or design document. All of these detailed outlines describe what it is, and then you verify it.
So your verification activity is confirming that your design outputs, what you actually built, match your design inputs, what you said you were going to build in your design or product specification document. Then you perform validation activities. Design validation confirms against your market inputs, your user needs document, your marketing specification, that the product is actually going to do what you said it needs to do and meet that unmet need you identified up front. What's nice about design control is that you follow through this process, you verify it, you validate it, and it's not only what we said it needs to be, it's built the way it's supposed to be, and it's reproducible. At any point where it is not, it feeds back into the loop, back into your original user needs document or your product specification document, where you can correct it. And this is a living thing that follows this cycle. Every time a complaint comes in, or you reevaluate your risks, and a complaint may require you to reevaluate your risks, that can feed back into your market needs or user needs document, your marketing specification, and your design specification, so you're updating, amending, and revising them as appropriate in this flow.
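The loop Joe describes, user needs feeding design inputs, inputs feeding outputs, and verification and validation closing the traces, can be sketched as a toy traceability check. All of the document IDs and requirement text below are hypothetical, purely for illustration; they are not any real design file.

```python
# Toy design-control traceability model: user needs -> design inputs ->
# design outputs, with verification and validation as trace checks.
# All IDs and text are hypothetical examples.

user_needs = {"UN-1": "Fly passengers from Montreal to Edmonton safely"}

design_inputs = {  # what the solution must do; each traces to a user need
    "DI-1": {"need": "UN-1", "requirement": "Carry enough fuel for the route plus reserves"},
}

design_outputs = {  # what was actually built; each traces to a design input
    "DO-1": {"input": "DI-1", "as_built": "Fuel tanks and gauging sized for the route"},
}

def verify(outputs, inputs):
    """Verification: every design output traces back to a design input."""
    return all(o["input"] in inputs for o in outputs.values())

def validate(inputs, needs):
    """Validation: every design input traces back to a user need."""
    return all(i["need"] in needs for i in inputs.values())

print(verify(design_outputs, design_inputs))  # True: built what we said
print(validate(design_inputs, user_needs))    # True: it meets the unmet need
```

A real trace matrix is far richer, but the feedback loop Joe mentions is the same idea: any check that fails sends you back to the document upstream of it.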

Aaron Harmon:

So using the airplane as an example: they knew they had to get from one city to another, which would have been like a user need. To do that, they needed a certain amount of fuel, so there would be a test to ensure they had the right amount of fuel, and then maybe some in-process verification during the flight, where they'd be watching fuel levels and getting alarms to monitor progress. That's what allowed them to deliver customers to the next destination.

Diane Cox:

One thing that Joe mentioned was risk analysis, which feeds so nicely into design controls. And this is where I think this issue is very connected to the design control world: through risk management. Imagine they had played out this scenario, walking through the whole sequence of events from takeoff all the way through landing and everything that could potentially happen along the way. You get a bunch of smart people in a room and go through all of those, even the ideas that might seem very crazy. You jot them down, you figure out the probability of each one happening and what happens from a severity standpoint if it does, and then you figure out which of those risks you need to put mitigation controls in place for. And you know, some things get missed along the way; not everything is a perfect process. But I feel like had they gone through this type of exercise, this may have come up during a risk analysis, and then they would have had controls in place to prevent such a thing from happening, rather than having to deal with it after the fact. But hindsight is 20/20, of course, right?
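The probability-times-severity exercise Diane describes can be sketched in a few lines. The hazards, scores, and threshold below are made-up illustrations, not a real aviation or device risk file.

```python
# Toy risk analysis: score each hazard as probability x severity
# (1-5 scales), and flag anything at or above a threshold for mitigation.
# All hazards and numbers here are hypothetical examples.

risks = [
    {"hazard": "Fuel gauges blank at dispatch",        "probability": 2, "severity": 5},
    {"hazard": "Unit confusion (pounds vs. kilograms)", "probability": 3, "severity": 5},
    {"hazard": "Coffee maker inoperative",              "probability": 4, "severity": 1},
]

MITIGATION_THRESHOLD = 10  # scores >= this need a mitigation control

def needs_mitigation(risk):
    return risk["probability"] * risk["severity"] >= MITIGATION_THRESHOLD

flagged = [r["hazard"] for r in risks if needs_mitigation(r)]
print(flagged)  # the two fuel-related hazards, not the coffee maker
```

Real risk files (for example, those built to the ISO 14971 standard for medical devices) also track detectability, mitigations, and residual risk, but the probability-and-severity screen is the core of the exercise Diane describes.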

Joe Ostendorf:

To that point, it's always hard when you're designing something for the first time. Risk is so important, right? Assessing that risk and determining the probability of occurrence and the severity. But it's really hard if you've never done it before to imagine all the potential scenarios where something could go wrong. I almost believe that their system worked, because they changed their procedures after the fact, and somebody went along and added this risk to their risk assessment. It's a little bit of Monday-morning quarterbacking here, but they were probably wondering: why wasn't this included up front? What do we do when a gauge doesn't work? It seems kind of obvious, but it could very clearly have been a risk that wasn't identified, along with what the failure modes are, the severity, the occurrence, and the outcome.

Aaron Harmon:

Yeah, and with hindsight, I have no idea where aviation was coming from in 1983 compared to now. But you could work in something where, if circuit breakers are turned off, you can't start the plane. You could integrate some kind of failsafe that forces the problem to be remedied instead of allowing people to bypass it and move on.

Diane Cox:

But to that point, now that is documented knowledge. In the medical device world, we use that information about other people's devices, competitor analysis, to look at things that may have happened to products similar to ours. The FDA, for example, has databases you can search for all sorts of adverse events, field actions, recalls, things like that, so you can get a peek at what has happened to other products, things you maybe haven't thought about yet for your own. You might want to look at similar products, the exact same kind of product, even things made with the same materials. That knowledge is so beneficial to have at our fingertips. And even if you're a company with established products, taking a look back through your complaint system and feeding that back into your new design process goes a long way toward preventing things from happening over and over again.

Joe Ostendorf:

Yeah, and I can jump in and tell you that if you forget to look at the MAUDE database from FDA and see what complaints your competitors or potential competitors have had, FDA will point it out to you. I've had several submissions to FDA where I've tried to do my diligence, looked at the MAUDE database, looked at other reportable events that competitive devices have had and what the outcomes were, and tried to mitigate those proactively in my submission. But if I forgot to do that, FDA would be quick to point it out. I've had several instances where they asked how we had addressed a failure that didn't occur in our test data and was not in our submission, but was clearly from a competitive device, where they had seen that failure on the bench and it later led to what I believe was a reportable event seen in the MAUDE database. So every time you put in a submission, a sage regulatory professional will know their areas of weakness and will already start working on a rationale for how to address them when the question comes up. I'd attribute maybe 30% of any submission to those anticipated areas of concern or questions from a regulatory body. Then there's the 30% you just can't anticipate, because it happens to be a hot-button topic for them at that moment. And there's another component, which is the thing you just didn't expect, because it happened to a competitive device you have no awareness of.
It's not only happened with submissions that I've sent to FDA; it's also happened with notified bodies, especially now that the regulations have changed in Europe. With fewer notified bodies and a centralized system for adverse event reporting and vigilance, there is more of an opportunity to learn from competitors' failures or observations in that space as well, and that's only going to help us, I think, in all of our submissions worldwide.
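The diligence Joe describes, searching FDA's public adverse event data before a submission, can be done programmatically through FDA's openFDA API, which exposes device adverse event reports drawn from MAUDE at `api.fda.gov/device/event.json`. The sketch below only builds a query URL (no network call); the brand-name search term is a hypothetical example, and the full query syntax is documented at open.fda.gov.

```python
# Sketch: build a query URL against FDA's openFDA device adverse event
# endpoint, which serves MAUDE-derived reports. The search term is a
# hypothetical example; see open.fda.gov for the full query syntax.
from urllib.parse import urlencode

BASE = "https://api.fda.gov/device/event.json"

def maude_query_url(brand_name: str, limit: int = 10) -> str:
    """Return an openFDA query URL for adverse events on a device brand."""
    params = {
        "search": f'device.brand_name:"{brand_name}"',
        "limit": limit,
    }
    return f"{BASE}?{urlencode(params)}"

url = maude_query_url("infusion pump", limit=5)
print(url)
```

Fetching that URL (for example with `urllib.request` or `requests`) returns JSON whose `results` array holds individual event reports you can mine for failure modes seen on competitive devices.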

Aaron Harmon:

Information sharing is so beneficial and so available, and people need to take advantage of it.

Joe Ostendorf:

And it's needed. I mean, it changed in Europe because the regulators felt this is needed and has to happen for the betterment of all devices, and ultimately the patients.

Aaron Harmon:

So, other thoughts? I think this is such a deep topic that you could go down any one of these threads into its own little world.

Joe Ostendorf:

Yeah, I think each one of them could be its own podcast, which gives you ripe opportunities for future episodes.

Aaron Harmon:

I've been harassing Diane about season two, and we've got season three queued up. First of all, thanks for being on the show, Joe.

Joe Ostendorf:

Yeah, thank you very much for having me. I really appreciate it; it's been a pleasure to be here. Thank you, Aaron. Thank you, Diane.

Aaron Harmon:

I want listeners to know that Joe works with me and the company I work for, and every time we get on the phone to discuss a project, it's so easy for us to go down little rabbit holes about regulatory and quality. Joe, you're such a good resource for those things; you have a lot of knowledge and bring a lot of expertise to this field.

Diane Cox:

I agree completely.

Joe Ostendorf:

I appreciate that from both of you. Thank you. I learn from both of you equally as much. Aaron, you are so intelligent with regulations; a conversation with you about what I once knew and have since forgotten not only inspires me to go back and relearn, but you've taught me new ways to think about things. And Diane, I really enjoy working with you and your elegant solutions. A quality system could be so mundane, but it takes a new angle to make it an elegant solution to what could otherwise be a complex problem, and to keep it simple, because we're all people and we have to work through it. What I enjoy about working with both of you is that you're not afraid to ask questions and challenge each other. I learn from that every time we interact, and I appreciate it. So thank you both.

Diane Cox:

Appreciate that.

Aaron Harmon:

Thank you, Joe. Joe, do you mind if I put your email contact into our show notes for listeners. So if they want to reach out to you they have an easy way to find you.

Joe Ostendorf:

Yeah, that would be great. I appreciate that. Thank you.

Aaron Harmon:

I will do that. Thanks again. We'd also like to thank all of our listeners for tuning in, and we're looking forward to bringing you the next episode.

Diane Cox:

We hope you enjoyed this episode. This was brought to you thanks to the South Dakota Biotech Association. If you have a story you'd like us to explore and share, let us know by visiting www.sdbio.org.

Aaron Harmon:

Other resources for quality include the University of South Dakota's biomedical engineering department, where you can find courses on quality systems, regulatory affairs, and medical product development. Also, if you live in the Sioux Falls area, check out the local Quality Assurance Professionals Network; you can find out more by clicking on the link on our website. Diane and I would like to thank several people, but a few who stand out are Nate Peple for support with audio mixing, Barbara Durrell Christian for support with graphic design and web, and lastly, the South Dakota Biotech Association for their support.