The Security Circle

EP 060 Julian Talbot, Author of ‘Tales from the Edge’, Discusses Risk vs Decision Making and the Anatomy of Disasters

Julian Talbot Season 1 Episode 60

Send us a text

Julian has 35 years' international experience, up to and including Director level and C-suite, in the commercial, government, and not-for-profit sectors.
 
Julian’s operational experience includes seven years with an operational firefighting unit in Australia. Strategic leadership roles include 30 years’ experience as a company director of commercial corporations, including the ASX-listed Citadel Group Limited, as well as 20 years’ experience as a director of not-for-profit organizations such as the Risk Management Institution of Australasia (RMIA), the Australasian Institute of Professional Intelligence Officers (AIPIO), and the Washington, DC-based Security and Risk Management Association (SARMA).

Julian is internationally recognized for his business continuity and risk management expertise. His experience includes roles in Africa, Australia, Asia, Europe, and North America. He was awarded the Australian Security Medal for Conspicuous Service for his service to the security profession in Australia. Julian is one of only five people accredited by the Security Professionals Registry of Australasia.
The SRMBOK newsletter and resources at https://www.srmbok.com/ are the main links to share.

Security Circle ⭕️ is an IFPOD production for IFPO, the International Foundation of Protection Officers.

If you enjoy the Security Circle podcast, please like, share and comment, or even better, leave us a five-star review. We can be found on all podcast platforms. Be sure to subscribe. The Security Circle drops every Thursday. We love Thursdays.

Yoyo:

Hi. This is Yolanda. Welcome to the Security Circle podcast. IFPO is the International Foundation for Protection Officers, and we are committed to providing meaningful education and certification for all levels of security personnel, and to making a positive difference to our members' mental health and well-being. And we want to thank all of our listeners around the world. Thank you for listening and for being a part of this. With me today is a very special guest. He's from Canberra in Australia. Some of you will know him as a top-notch author. His name is Julian Talbot, and we're going to be talking about his book today, Tales from the Edge. Julian, welcome to the Security Circle Podcast.

Julian:

Hi, Yolanda. Thank you. It's lovely to be here at opposite ends of the world, but with modern technology it's like being in the same room.

Yoyo:

That's why you look all rosy-cheeked and I've got a croaky voice: it's your afternoon and my very early morning.

Julian:

And it's springtime here in this part of the world.

Yoyo:

Yeah, shut up, Julian.

Julian:

Oh, we're off to a bad start already. Sorry about that.

Yoyo:

We do like to talk about the weather in England, as we have been doing already. Julian, look, your book's phenomenal. A lot of security professionals love reading about real-life risk management stories. There's a lot to talk about with you, because this isn't your only book. You've also published the Security Risk Management Aide-Mémoire, which has got lots of tips and tricks in it, lots of matrices. It's phenomenal; it's a junior professional's guide to excellence, really. Why did you write a book about real-life risk management stories? Let's start there.

Julian:

I've written eight books now. They all start with a vague, ill-formed idea in a cafe. With this one, I was thinking about how I communicate risk. On the one hand, you can write books about theories and diagrams and models, and they're great; the Security Risk Management Aide-Mémoire is full of color diagrams and 'do this, do that'. But when I thought about how I really teach risk, when I'm talking to people or giving a presentation or running a training course, I always keep going back to the stories. And there's a reason for that: we are hardwired for stories. We learn what happened to somebody else in a way that connects with us, in a way that no amount of models and abstract text can do. So it grew out of that. I thought, I'll just write a story for another one of my blog articles, and that turned into another story, and then: there's a book here. Before I knew it, it went from a very thin book to about 300 pages of risk stories.

Yoyo:

You are a very passionate writer, actually. I think writing is one of your primary languages. Would you say that you communicate most effectively through the written word?

Julian:

I think so. I like to think that I communicate reasonably well in person, but writing is the default that I go to quite often. I remember sitting down on holiday at one point to write a couple of notes, and in four days I wrote 20,000 words of a book on travel safety. Richard Bach summed it up well when someone asked, do you love writing? And he said, no, I can't stand it. But, they said, you've written 20 books. And he said the only time he writes a book is when there's an explosion and dust and rubble come through his study wall: an idea grabs him by the throat and drags him to the keyboard. And I can relate to that. When there's an idea in your head at four o'clock in the morning, it's: I think I need to get up now and just start writing. And then 5,000 words later, it's half formed.

Yoyo:

That's a compulsion, isn't it? And that's a talent as well. Whether you do it with passion or not, the fact is you're getting that call at four o'clock in the morning and your brain's waking up going, ah, gotta get this out, gotta get it out, got to start typing. So that was going to be my next question, because I produce another podcast as well, called Turning Pages, where we will be featuring your book, and I often ask authors: what was the process like for writing? Was it hard work? Even Dr. David Rubens says the book he wrote wasn't the book he wanted to write. Some people find it incredibly hard. So do we have more coming from you?

Julian:

Yeah, I've got four currently: three back from editors, one that I need to finish off to publish probably in the next few months, and then another one going to the editor, maybe in January. And they just write themselves. I'm a first-draft kind of guy. Some authors love the polishing; that's not me. I just go whoosh and there's a first draft, and everything after that is a little bit of work until eventually I get to the point where, how do I put this nicely, it's just got to go out to the world. I can't edit it again. And I've never yet published a book that I'm completely happy with. That's the nature of writing, really.

Yoyo:

I would describe you as a seasoned security professional. Looking at your career, and some of you will be looking him up on LinkedIn as I have been, you've held several very senior leadership positions in security, haven't you? So you're coming from a position of strength; you're sharing your expertise. So look, let's talk about Tales from the Edge. There's a phrase, isn't there: never waste a crisis. It's okay to say that once the crisis is over, but no one really wants to be sitting there in a kind of courtyard audience saying, oh, we're watching this roll out, because it's a lot of pain for other people. A lot of the crises you talk about, and a lot of the crises that I like to talk about, did tragically involve real and significant threat to life, and casualties. There is still, nevertheless, a lot to learn. Let's start with a phrase you used: the Titanic effect. What does that mean? Take us through the Titanic story.

Julian:

So the Titanic effect refers to this idea that the more you believe a risk to be impossible, the more likely you are to actually manifest that risk. It goes back to this concept: the Titanic was unsinkable. Therefore, they didn't need to have lifeboats for every passenger on board; they didn't need enough lifeboats for all the crew and passengers. Therefore, they could proceed at speed on the maiden voyage, knowing, believing, I should say, not knowing, that even if they grazed an iceberg, they were unsinkable, therefore it didn't matter. So the very idea that they were unsinkable actually contributed to the net effect. Quite frankly, if you don't have enough lifeboats, you are relying completely on being unsinkable. And if you're traveling at speed, trying to set a record for the fastest crossing on this maiden voyage, because you've got a reputation to build, you do a whole lot of things that make that risk more likely to happen, because you don't believe it can happen. For example, if you believe your car has all the driverless controls, that it won't veer across lanes because it's got that programmed in, so you don't need to wear a seatbelt and you can spend the time tuning the radio and what have you, you're actually increasing the likelihood. And so many corporations do that. They have this idea of the Black Swan. I like Nassim Taleb's book, The Black Swan, but the Black Swan idea is often taken by a lot of people to mean: oh, sorry, that was something we couldn't foresee. So it's like this get-out-of-jail-free card of, no, no, boss, we couldn't have seen that coming. That's impossible. That was a Black Swan. Therefore, we had no duty to prevent it. The Titanic effect just speaks to that idea.

Yoyo:

So a black swan, then: the definition is that it's a metaphor describing an event that comes as a surprise, has a major effect, and is often inappropriately rationalized after the fact. I think a lot of people describe things as black swans, but they're not really black swans, are they? Because let's face it, we look at the Titanic, and can we say as professionals that was a black swan? No, because we can all see with hindsight all of the issues that would have made it inevitable had they struck an iceberg. What's your view about the true meaning of a black swan? A lot of people say the Twin Towers and 9/11 was a black swan. And it is, to a degree. However, it wasn't uncommon for aircraft to hit tall buildings. One hit a building here in Canary Wharf. There are several others. There was another incident of somebody going straight into a tower, and other instances around the world. So one would say that's not really an unexpected thing that could happen. What's your view?

Julian:

Oh, I totally agree. We know in 1945 there was a, was it a B-25, that flew into the side of the Empire State Building, so we'd already had buildings damaged by aircraft in New York City. We know we've had aircraft hijacked. We know we've had suicide bombers. We knew Al Qaeda had made enormous threats against the United States. And I don't like to say this in hindsight, but there was evidence, through the intelligence, that things were going on, which, had they been put together at the time, would have made sense. We can't predict which building; we shouldn't try to predict that something's going to happen, but we need this all-hazards mindset. And we need to think about security from the point of view of consequence management: what are the vulnerabilities? How do we think about it? The irony of security is that if, in the year 2000, some congressman had insisted on putting an act of Parliament or act of Congress through requiring that all aircraft cockpit doors be hardened and armored, with no access permitted under any circumstances, they probably would have been pilloried, drawn and quartered, for the cost to the airline industry. The media wouldn't have talked about a potential 3,000 lives saved; they would have spoken about the cost to the airline industry and to the traveling public. Security is very much like that. In that context, there are three laws that I talk about, three different types of risk. There's actuarial risk, where you can calculate the known numbers: how many house fires there will be in the city of London, how many car accidents on a given stretch of motorway. We know fairly well what they are. Then you've got adaptive risk: market forces, where your competitors are trying to out-compete you, but they're within the law of the land. So the actuarial is the law of large numbers; adaptive risk is the law of the land, where they're trying to out-compete you.
Playing football, for example, is a classic case of that. But security works in this other paradigm, which is the law of the jungle: adversarial risk, where the other guy doesn't win unless they take you down. So it's not enough for us to say, oh, I didn't see that possibility, or I didn't prepare for it, or I didn't have enough firefighters, or whatever it is, because we need that all-hazards mindset with security.

Yoyo:

You know, I'm into sci-fi, and this thing about time travel always excites me, Julian. I think it'd be nice to go back in time. I'm thinking of this after delving into risk management, and in particular my favorite crises to study: primarily airline crashes, the King's Cross fire, Katrina as well, which is something that really piques my interest, and 9/11. And you think it'd be so cool; look at Zeebrugge, for example: they just didn't shut the bloody door. And it sounds like an awful thing, but I would hate to go back in time now, because as a crisis manager, when you look back, you see how reckless organizations and institutions were with our lives. Classic example: smoking was banned on tubes before the King's Cross fire, but it wasn't banned everywhere else within the tube station. And we all know that particular fire started as somebody was lighting a cigarette on the way up out of the tube station on the escalators, and the loose match went through the escalator and into a pile of rubbish underneath it. So we all know that, and we look back now and think: how on earth could we ever justify smoking in a tube station, let alone underground? How reckless were we with our own lives, and we were allowed to be.

Julian:

And it wasn't that long ago that we had smoking on aircraft. When you just think about how ridiculously dangerous that is.

Yoyo:

Everything smelled bad on a plane as well. And even in cinemas. I remember the mindset. Now, when I get on a plane, I'm hoping I don't sit next to a screaming child. But back then it used to be: I hope I'm not going to sit next to someone who's going to chain-smoke. You could tolerate the odd bit, because we'd all become tolerant of it, but if you were near a chain smoker, that was just unpleasant.

Julian:

Things have changed now. And the irony, in this context, is how much of what we take for granted. By the way, when you talk about how they put our lives at risk decades ago: every time I looked at one of these, they're still doing it. We are still doing it, just in a different paradigm. Like, the smoking industry kicked up all sorts of complaints when they were going to ban smoking in pubs. The publicans' association rallied round: we're going to go bankrupt, you're going to force us out of business. And lo and behold, so many people said, hey, I'd happily go to a pub now that I don't have to breathe secondhand smoke, and business boomed. So these are little twists of logic that we tend to believe and take for granted.

Yoyo:

I was a smoker in one part of my life. I mean, there was a history of telling everyone that smoking was cool and that it was good for you. Go way back to the 40s and 50s and how it was glamorized, and how we can be persuaded as a nation, or nations, through movies: every time you see Bette Davis in a movie, she's got a fag in her hand, because back then she was told to; it was the coolest thing ever. I don't know how we could ever look back now and say that was a smart thing, when you think how unhealthy it must be for lungs. But it's not the only time, is it, that humans have made shortcuts and money has been the key underlying factor, Julian.

Julian:

Look at the pandemic we recently went through and all the disinformation and misinformation, the dubious procurement practices that went on, the health guidelines that in some cases are still out of date and against known logic. Here in Australia, for example, just to pick a simple example, up until about 2022 we were being told that the vaccine stayed localized; it couldn't be the cause of blood clots because it didn't go anywhere else in the body. But it turns out that about 10 months before they finally changed their mindset, as early as February of 2021, our Therapeutic Goods Administration actually had documents from Pfizer showing that the vaccine spread all through the body: to the brain, the liver, the ovaries. So this is about our belief in the authorities, and the amount of disinformation that comes not just from malevolent actors but also from, how to put this politely, let's just say disingenuous or negligent authority figures. I was a senior risk advisor for our Department of Health and Ageing in the 2003 SARS outbreak, and I saw so many of the behaviors there which were repeated. I don't want to make the podcast about that, but I think the point is you really have to do your own research, and you have to think about it from a risk point of view. A lab trial, for example, about a new pharmaceutical or whatever it might be, is taken in some circles as being a risk assessment: the risk factors are yada, yada, yada. But as a risk practitioner, I look and say no, we don't use pharmaceuticals in the community the same way lab rats do in a test environment. People do all sorts of things. They have comorbidities, they take alcohol, they take other drugs, they have lifestyle choices. To do a risk assessment, you need to think about how it works in the real world. And this is what so many of these tales in Tales from the Edge start to lay out.

It's always the human factor. It always comes back to the human.

Yoyo:

Okay. So putting the vaccine to one side, because we know that's a hugely contentious and very divisive subject. We can safely say, as professionals, that Obama, for example, said he felt we weren't prepared for a pandemic, and this was in 2014; that's a quote cited by Dr. David Rubens in his book, Strategic Risk and Crisis Management. And I think every single security professional would look back and say we've always been called harbingers of doom: the person that turned around and spoke to Obama's predecessor and said, hey, listen, we need to carry on with this because it could happen, and you can imagine the answer: nah, sorry, low risk. So again, there are decisions being made that affect our lives, and it's worrying. So I'm thinking, look, we agree, and it doesn't matter how many investigations there are globally around COVID, we agree that everybody was unprepared. Not everybody, not everybody; we'll come on to that in a second. But a lot of countries were unprepared because they chose to consider it a low-impact risk, or a low likelihood of happening, even though professionals would say different, because I think we had a lot to learn from mad cow disease, and you've certainly got a lot of lessons learned from SARS. And that's not the only time as well, but COVID was the first time we could see something rolling out like a movie, an outbreak. And I think we were very lucky as a human race. We were lucky that this COVID virus had a long incubation period. We were lucky that people were developing immunity naturally, in low percentages though. And we were lucky that people weren't dying within 24 hours, which has been widely televised with other types of viruses, like Ebola, for example.
I was talking to somebody just recently about the pre-vaccine period, and she reminded me that we all felt rather desperate, pre-vaccine, for something safe, a cure, so that we could carry on living our lives as normal, because we were then in lockdown with no vaccine but a lot of pressure for a vaccine. I think it's important that we remember those times. We can all remember them, and we know that the vaccine has a huge pro and anti rhetoric, which is why it's important not to go into that. But pre-vaccine, we didn't know what was going to happen. And we didn't know the percentages: if you had an underlying health condition, this virus was going to dig it out, and you were going to realize you had it even if you didn't know you had it before. That's my opinion. It really did target the weak. And I haven't been through COVID; I have never had it. So let's go back to the time pre-vaccine. There were some knee-jerk reaction decisions made, some of them too late, with hindsight, we know. But you and I both know, Julian, even though there was a lot of pressure, for example, to shut down the airports, you can't just do that. You've got thousands, you've got a million people in the air at any one time. And you can't stop people coming home, because that's the biggest risk, isn't it? So let's talk about those early days. What would you like to say about it?

Julian:

Gee, where to start? Interestingly enough, the countries that were prepared were places like Singapore, that region of Asia, where basically they had gone through SARS much more significantly than we had. After that epidemic, Singapore had set up clinics based around this whole idea of not having to congregate people into hospitals. So firstly, you've got to ask: why weren't we prepared? There are two elements to that. Firstly, there's only a finite bucket of money, and there are a whole lot of strategic things it has to be spent on: healthcare, infrastructure, all sorts of things. So if you take money away from roads and hospitals and healthcare and education and schools to put it into pandemic preparedness, that's a difficult decision. It's made even harder by the three-year electoral cycle, because you don't want to be the person who took money away from what's perceived to be a public good and 'wasted' it, quote unquote, on something that didn't happen. Then you've also got to remember that our politicians and our public servants, although most of them want to do the right thing, all at some level have to manage their personal career risk. So standing up on a soapbox and saying we need to do more of this can come at a personal cost for them. You've got to put that into it. If I was to take some of the elements that I saw firsthand, from SARS and indeed prior to this: firstly, nobody in the world, and I've looked, has produced a solid risk assessment that I've been able to find. By that I mean an ISO 31000, whole-of-life risk assessment, working it through as a process, using a bow tie or any of these types of tools. I can see all sorts of inputs, and one day perhaps I'll write that book, because in my mind I know exactly where I want to go with it: do a thorough risk assessment of how the world really is, how people actually behave.
If we had a look at the data we've seen, we now have good metadata, and we can retrospectively say: we know masks weren't effective; we know there were side effects from vaccines. We really don't know whether the vaccines were better or worse. I'm not saying it either way; I'm just saying I haven't done the analysis to be able to determine whether you were better off with natural immunity or with vaccines. I don't think anybody else has, but there's enough evidence there to know that it needs investigating. There's also the evidence we already had around vitamin D, and we missed a couple of huge wins there. For example, a really simple thing: just take a blood sample from each person who comes into hospital during the pandemic. We don't have vaccines, but we've got the first cases; just find out what their serum vitamin D level is and put that into the database. If we'd done that simple trick, we'd have a huge database to be able to say whether vitamin D is effective. The evidence suggests it is; we've got good evidence now, but we would have had an enormous amount of evidence for very little cost.

Yoyo:

I remember as well which country made the connection with vitamin D, and that was Indonesia.

Julian:

Your chap, Dr. John Campbell, has been publicizing that and a whole lot of other research coming out. Yeah.

Yoyo:

What a great man. He's still going now, isn't he, John Campbell? He hasn't said anything yet that I've really disagreed with, and he's stayed very neutral. For those of you that don't know Dr. John Campbell, he has a history in the medical profession, and during COVID he produced a YouTube episode every day focusing on global trends and stats and figures, and his listenership were just open to having sensible information, weren't they?

Julian:

Yep. And he's fabulous. I love that he's very measured and very British, very gentle, but evidence-based. He doesn't dwell in controversy; he just presents the evidence. And it's been almost a little sad to watch the transition from this very enthusiastic allopathic-medicine stance, you must get vaccinated, you must have masks, you must trust the authorities, you can count on what they say, over three years, to his fundamental disappointment, which I have to say I share, in our health advisors, in our health policy, and just the glaring gaps. He's very diplomatic about it, and he doesn't voice his opinion; he shares the evidence. And the more he unfolds the evidence, you can see it. I honestly feel sorry for him, having lived his whole life not in the allopathic scheme but in this evidence-based scheme, where we relied on what the national authorities were saying, only to find out that they're dubious.

Yoyo:

Let's look at the general public, and I'm going to refer to the British general public in the main here, though there are variances in different degrees. It's pretty hard to tell the general public to do anything for their own good. You get this, right? I'm going back to the days when we used to do fire tests in tall buildings in London. You'd always get that percentage of people who are like: you can't tell me what to do. If I'm going to go down to the basement, I'm going to go down to the basement, get my bike, and I'm going to go home now. Who are you to tell me what to do? There's this minority group that will not conform to what is safe for everybody. Let's take that scenario and blow it up into COVID. And then you've got a group of people that are like: I'm sorry, no one's going to tell me I'm going to stay indoors. Sorry, I'm going to do my thing, and that's what I'm going to do; this is a whole load of rubbish, blah, blah, blah. So it is hard. And this whole crisis around COVID brought out quite a lot of anti-establishmentarianism. It's not often that I can get that word into a podcast. But you could see a huge amount of conformity as well. A lot of people would just do the right thing and stay in and not harm anybody else. I had elderly friends who were going around chatting to each other in each other's gardens, at least two meters apart, and they were all being very safe for each other. So you've got these polarizing opposites of behaviors. And then the minute you don't tell someone to do something: we need to be told, we need to be told what to do. I think of those people that conform, there's a certain percentage of them that need to be told to do that. Not everybody can be relied on to do the right thing instinctively or with the right common sense.

So the government, I think, and let's just be very pragmatic now, even if they've made mistakes: I don't know any government that got it all perfectly right. I have a huge amount of tolerance for the fact that there was unpreparedness, because I expected that. But it's really hard for leaders of nations to do the right thing in everybody's eyes, because there's always going to be a percentage of the people that say: oh, so you're telling us to do this. And the other part will be like: why haven't you told me? Do I need to wear a mask? Am I wearing a mask here? Don't I need one? Just use common sense and wear one. So it is the devil of jobs in a crisis for leaders, to be that resonating sound of reason, isn't it? And they will face criticism either way. I think that's the best way to phrase that.

Julian:

I think you're right about the point that a lot of people don't want to be told, but I think they're a minority. Most people, certainly in the pandemic, did what they believed to be best: they followed the advice as best they could and as they understood it. You can tell people to stop smoking, to get a good night's rest, to exercise, to eat well, and a certain part of the population will say: I don't care if it's going to kill me, I'm going to continue smoking, right? So whatever happens, you won't reach them. I think my concern with the pandemic era now is that, while there's a lot of evidence coming out, we're not seeing much, how to put this politely, there's not much ownership: oh, yes, sorry, we made a mistake, or yes, we were wrong, or we had that information but it didn't make it through to our policy. The wheels and machinery of government turn too slowly. So I'm concerned that when we do have the next pandemic, which we will, or indeed any sort of crisis, trust in our governments is really at an all-time low. I see all this unfolding with the CDC and the WHO and internally contradictory material. So that's a real risk, I think, at a societal level: we erode trust at our own peril, particularly in a time of deepfakes and AI, populism on the rise, and geostrategic tensions globally at a peak. And here we are, in some respects, making things worse, increasing our risk profile.

Yoyo:

It's interesting, actually, the kind of morbidity around what the most-downloaded movies were in the very beginning, the first months of COVID, February, March, April, May. And those were movies like Outbreak, and Contagion, the one with Jude Law and Kate Winslet. There are things to learn watching those movies as a risk professional, things to learn about contagion and how good hand hygiene, for example, can prevent the spread. I think that was the key message in those movies. But there's also a part of me, and I never wanted this to happen, but because people weren't bleeding from their eyes and didn't have lesions all over their faces, there was this kind of disbelief that COVID was actually happening. I always believed that it was happening. It only presented initially as a sore throat, and it's only when people were struggling to breathe that they thought: this shit's real. Now I can't breathe, now I'm panicking. I still can't breathe properly, now I'm panicking more, my heart rate's going up, I'm panicking more, and then you get the ambulance called. So you had that happening on a mass scale: people panicking because they realised this shit was real and they were struggling to breathe. And we've got to remember that, no matter how many things went wrong, there was no vaccine for that, and people were becoming seriously ill.

Julian:

I think, on the plus side, this pandemic has been preparation for the big pandemic. Look at 9/11: the World Trade Center buildings went down, and the New York Stock Exchange closed for three days. Now, they could have been up within hours, because they had an offsite hot site ready to go. They didn't go up for three days because they didn't want the market to crash, but the fact is that they could have been, and they were ready. And the only reason they were ready was because we'd prepared for Y2K. That was the only reason they had all that backup ready to go for the mission-critical elements; until Y2K, nobody had taken the cost of it seriously. Similarly, in Jakarta in 2004, when our Australian embassy was bombed, we were fortunate that only people outside were killed. The only reason there weren't more fatalities, in fact, inside the building, was that a few years previously, when Australia was working with East Timor to help achieve independence, some of the Indonesian soldiers took offense and would fire their rifles at night, taking potshots through the windows of the embassy. So all the windows of that embassy were hardened with ballistic glass. Now, they weren't blast-resistant, so a couple of them fell in, but by anticipating and experiencing and dealing with one threat, they had inadvertently prepared for the next. And I think that's what's going to save us: this pandemic, as awful as it was in many respects, will probably be seen as a blessing in hindsight.

Yoyo:

But Julian, I'm going to challenge that just a little bit, if I may, because I've heard it said, and I'm going to use this very loosely, that there's no way anyone's going to get away with another lockdown again. Unless people are bleeding from the eyes, Julian, people are not going to accept being locked down again, unless they believe there is a significant harm to their life. They're not going to follow, because, like you said, there's that point about the lack of trust in government now. And it's making me think: no matter how good you are as a business owner or business leader, when you've lost the trust of your board and you've lost the trust of your employees, it's time to move on. There's this realization that that's just a natural cycle. You might not be doing anything fundamentally wrong, but if you've lost the trust... and the media do a great job of eroding that, don't they? The media play a huge part in crises. What sort of examples have you got, perhaps, where we can see how the media have done a great job in supporting crisis response, but also how they've done a horrendous job in what they've said?

Julian:

Okay, let me first come back to your point about people not wanting to be locked down. I think we've learned that other things work well. When you look at the evidence, lockdowns actually are not effective. What was more effective in a lot of countries was voluntary social distancing and physical distancing. The other interesting thing that's come out in the research: we now know that masks are not effective. Handwashing is effective, not hugely, but it's effective. Masks have a low benefit, if any, largely none. Vitamin D is effective. Managing immune systems is effective. So consider the whole hospital-based approach: put doctors and nurses and orderlies and what have you into that environment, expose them to all the COVID patients in a single location, ask them to work 30-hour shifts, have them live on hospital food for a couple of weeks, sleep on gurneys in the corridor, give them no stress management, no meditation, no proper rest, no vitamin D supplementation, nothing that would support their immune system. I would like to think that we have learned from that for the next time, if and when people's eyeballs start bleeding, so to speak, or it becomes serious. Now, as for the media, there is a part of me that says the media is actually one of the biggest risks we face in the 21st century. Because a journalist, bless them, even a well-meaning journalist, will say, honestly, the media is here to expose the truth. And I have to say, you probably believe that when you come out of graduate school, hopefully you do. But talk to someone who's been a seasoned journalist for 20 years and they will admit that their job is to sell advertising. Their job is to grab eyeballs or ears, and good news doesn't sell. Common news, like a man dying of diabetes, or 600,000 people dying of heart disease, doesn't sell headlines, doesn't sell newspapers or advertising space.
In the same way, a crash with 300 people creates shock and awe. Now, the media can be, if you like, a force for good, but like anything, and I'm not picking on the media, maybe I am just a little bit, everything has two sides. Nuclear power has two sides to it. Coal energy has given us all sorts of liberation from drudgery since the industrial revolution, but it comes at a cost as well. I'm partly positive about deepfake news at the moment, because what it means is that people won't trust the media, and they won't trust even videos of, say, Bruce and Mary talking about this on the floor of the House of Commons or whatever it is. But it means we'll find other systems. And I'll give you a really good example. About 20 years ago, there was a lot of litigation in the music industry, where singers and songwriters and musicians were suing one another for stealing their riffs, their chords. It might only be a couple of bars, and they'd say, that's mine, you need to pay a royalty for it. And there was a big pushback which said, if we can't use old music, you're going to kill new music; it's going to die. But the courts upheld the copyright laws. And what we've seen in the last 20 years is an explosion of new music. People are so inventive. So we'll learn to deal with problems like fake news, with media problems, with risks of distrust of the government. We're clever. We find ways.

Yoyo:

I don't know. I still think humans are by nature lazy, and that manifests itself in humans taking shortcuts. There are so many examples of how we do the minimum: people can do the minimum at work, they can do the minimum in life, and governments can do the minimum, and the minimum usually has a cost attached to it, as does the maximum output. And because it's inherent in our nature as humans to take shortcuts, I don't know that humans in the majority will be inquisitive enough to think, I don't believe that, so I'm going to make the effort to find something that's telling the truth. I think everyone's been spoon-fed a lot by social media and by online news media, and we've become accustomed to those algorithms that think they've learned what we want to see. Just because I have opted for Star Trek on Google, it sends me an awful lot of news clips around Jean-Luc Picard and William Shatner, and I don't want to read it. If I want to read it, I'll go read it. A lot of it's clickbait. I don't like it. I form part of that public that is just so tired of clickbait. I'm not even bothered. I don't care if Jean-Luc Picard's got three nipples. I really don't care. I don't want to read it. I'm just using that as a wild, shooting-from-the-hip example; I've just seen the three-nipple episode of Friends, which is why I referred to that one. But not everybody's going to do that. You said as well that, around fake news, people are wondering, is that real? Is it not? I've got apathy. I'm not bothered. I'm not interested. I think human nature's got some very harsh lessons to learn, and I don't think we're going to enjoy learning them.

Julian:

Yeah. Yeah. Not a week goes by that I don't see a human being do or say something that just has me shaking my head, saying, when the time comes, I'm going to be on the side of the machines.

Yoyo:

Yeah. Do you know why? Because they'll never be able to find one single bit of evidence of me being mean to a machine anywhere. They'll be like, ah, she's one of us. She always says thanks to Alexa.

Julian:

I always thank Siri too. But equally, not a week goes by that I don't see something that makes you say humans are amazing. We got to the top of the food chain, so far. I'm not saying we'll stay there by any means; AI is almost inevitably going to surpass us. But we do do some amazingly clever things, and on average it's swings and roundabouts, because we seem to swing like a pendulum between this state of laziness and idiocracy and moments of brilliance. And we have a naturally programmed tendency towards bad news. It makes sense in an evolutionary way, if you think about it. If you're out in the bush a hundred thousand years ago and you receive two bits of news: oh, there are some nice tasty berries over there, good news; oh, there's a sabre-toothed tiger over there, bad news. The cost of ignoring the good news is fairly low, whereas the cost of ignoring the bad news is existential. So the various bits of clickbait and media take advantage of that, but we need to find a way to deal with it. I know it's not really in the space of security, but it always amuses me, Elon Musk's Neuralink. I'm not saying it hasn't got great potential, but you have this aspiration of connecting humans to AI, to a giant computer, to the internet, and essentially we all become connected as we want. We all connect into this one huge super-brain, this AI, and we have access to all the information in the world, we have all the thoughts, and we can task this AI to do whatever we want. The analogy that sits with me is this: it's my brain wired in, along with 10,000 other humans, to this supercomputer. I've got an IQ north of a hundred; it's got an IQ north of a billion. It's akin to a couple of flies landing on my shoulder and saying, hey, Jules, can you help us find another fresh, nice piece of horse dung? Can we just borrow you and task you to go find the latest piece of horse shit in the paddock, please?
And I'd go, yeah, no worries, nothing better to do. I'm just not seeing it.

Yoyo:

I love that so much. I'm never going to forget that. I'm going to refer to Judith Cook's book now, An Accident Waiting to Happen, which I would recommend for anyone who's interested in learning from crises. Chapter five talks about air traffic control and pilot fatigue. A lot of lessons were learned with pilots, and we're talking quite some time ago now, because they changed the rules, and I'm not being very articulate, but they changed the rules so that pilots around the world all spoke English. They realized that where there was a higher-than-normal rate of plane crashes, it was in countries where English wasn't used as the pilot language. And they realized that the other common statistic was that it was in countries with a very autocratic leadership style, so junior pilots weren't allowed to challenge the senior pilot. If they saw something irregular, they weren't allowed to say, captain, the wing is on fire, because deference was inherent in their culture. And when they realized that through speaking English they could communicate, and the hierarchy could be more accepting of that feedback, the plane crashes stopped happening. I just think that's such a phenomenal intellectual learning around human nature, human culture, and how a difference in culture can make a huge difference in how risk policy is applied.

Julian:

Absolutely. And I love the United States Department of Defense's Human Factors Analysis and Classification System, HFACS. It's a taxonomy of risk events based on research they did in the late eighties and nineties, when they realized that over 90 percent of all their accidents, initially on aircraft carriers, though they extended it beyond that, 93 percent, were down to human error. So they created this taxonomy to ask: was it a deliberate act or omission? Was it supervisory? Was it a precondition? Was it an organizational influence? I've been doing risk for over 30 years now, and writing this latest book, Tales from the Edge, surprised even me. Because what I found, and I knew it, but what I found was that every single story kept following the same pattern. It didn't matter whether it was the Ford Pinto or Piper Alpha or Deepwater Horizon: it was all human factors. And it was all layers of events, moral hazard and shared negligence, incompetence, people managing their personal career risks, will I get my quarterly bonus, will I get a promotion, being more important than anything else. You saw this pattern; it kept going back. I don't think I've come across a single fatality whose roots I couldn't trace back 10 years, where half a dozen or a dozen different barriers hadn't had to fail.

Yoyo:

Tell me about your research into the Ford Pinto story.

Julian:

The Ford Pinto story is fascinating, for those people who are not familiar with it. The Ford Pinto was a fairly modern, for its day, sports coupe that Ford introduced to tap into a market for smaller, more fuel-efficient cars and a younger market. They created it from scratch, and it had a fairly unique defect, in so much as if it was stopped at an intersection, and bearing in mind in America they drive on the right-hand side of the road, so stopped with the left-turn indicator on, and it was rear-ended at that point, the fuel tank could rupture, and the sparks from the indicator could cause the car to burst into flames. It became a bit of a meme, basically. Now, what's particularly interesting about it is the contrast with the Tylenol case. That was another case in the States, where someone decided to extort the makers of Tylenol by putting cyanide in some of their paracetamol bottles, and they asked for a million dollars. The company had a choice of covering it up, paying the ransom, or trying to track them down. Instead, they spent the equivalent of a quarter of a billion dollars in modern money. They took all the Tylenol off the shelves, bought back anything anybody wanted to send back with a full refund, and it almost cost them the company. But it restored public confidence in Tylenol. To this day, it's probably the leading brand of paracetamol. It's not a prescription drug, it's not a patented drug, it's just a generic, but people buy Tylenol above anything else because they trust the brand. By contrast, Ford, with the Pinto case, actually did the maths in an infamous memo, known generally as the Ford Pinto memo, which calculated the cost of bringing all these cars back in and retrofitting them in a recall versus the cost of paying out for the deaths, and they worked out the cost per death.
They worked out the number of times a car would be rear-ended at an intersection, and in the most cold-blooded analysis that's ever been made public, at least, they simply said: we're not going to recall, we're just going to pay out for a few deaths. Eventually this came to light, of course, as it would, through a class action, and I think 60 million, the highest payout ever at the time, was awarded against them for this gross injustice. So you have a leading brand, Ford, so iconic, the Model T Ford and a number of cars before that, snatching defeat from the jaws of victory. If they'd just done a recall, they could have said, look at us, we're the safest manufacturer, because we stand by our products.

Yoyo:

Two incredibly good examples of how one decision, although very painful to begin with, ended up being highly rewarding long term, and the other didn't. So basically, Ford got a couple of analysts in a room and said, what's the likelihood of this happening, and then multiplied that by how much it would cost to pay out. That now would make me never want to buy a Ford. That's how harmful that reputation is, the fact that they could even think like that. Let's look at Piper Alpha.

Julian:

Piper Alpha. It was this big thing in the North Sea, now at the bottom of the North Sea. Yeah.

Yoyo:

Tell me about it. I mean, we've seen the movie. It's great that they make movies from crises, because I'm going to talk about another adaptation to film in a minute. I think all people who want to be in crisis management should watch the movie about Piper Alpha, because it's like the 70s disaster movies, the one where the tower catches fire, or an earthquake hits Los Angeles, or The Poseidon Adventure. We all know something really bad is going to happen, and you find yourself going, no, no, don't do that. Don't do that. Don't go in that room.

Julian:

So Piper Alpha is fascinating. One of the reasons I keep going back to Piper Alpha is that it's now quite old, it's 1988, and it's such a well-studied case. It was an oil rig in the North Sea which blew up catastrophically, killing 166 people, and then one more died of his wounds the next day. The entire rig was razed to the waterline, basically. So many disasters are covered up, or they're whitewashed, or the truth never comes out to the public. You'll find in this book I try to stick to things which were well documented, where I could put my hand on my heart and say what went on. So to give you the short version: Piper Alpha was one of the most profitable oil rigs in the North Sea. It sat at an intersection where another couple of rigs pumped oil and liquefied natural gas, the sort of gas we use in our homes and on our stoves, to Piper Alpha, which processed it and then shipped it to shore. Nine o'clock on a Saturday night, essentially. If you've ever worked on rigs, they run quite a complex series of processes, so they have a central control room. One of the pumps trips out, and the operators look at it and say, this is no good, but we've got an auxiliary pump. It's just been taken offline for maintenance, but no work's been started on it, so we'll just switch on that auxiliary pump. Now, unbeknown to them, a couple of guys had done a little bit of work: they had taken a pressure relief valve off, but the documentation hadn't made it through to be cross-referenced. So of course, they switch on the second pump, and it spurts oil and gas out under pressure across the entire production area until it finds a source of ignition. It blows up, kills two people to start with, and then it keeps burning in this one area until it eventually melts the central main riser. When you see the video of it, what looks like an enormous fire up to that point was just like a house fire compared to a factory fire. The entire rig, and these rigs are big things, becomes enveloped in flame, and that's the point where it starts to drop.
Now, to make all these things worse, the permit-to-work system didn't work. That was a known problem; the permit-to-work system had failed a number of audits but had never been fixed. The person working the permit-to-work system that night was a new supervisor who'd never been trained in how to use it. They'd also recognized that they had problems with the deluge pumps, the water suppression system, and the problems were manifold. First of all, the sprinkler heads weren't suitable for the seawater being pumped through them, so they had a program to change them out. Interestingly enough, they had started changing them out four years earlier, but they'd changed out only one section, and it was basically the lowest-risk section. The area where a fire was most likely wasn't even on the schedule to have its sprinkler heads changed at that point. Even if they'd changed them, it wouldn't have worked, because the point where the pumps were activated was destroyed in the fire. And even if they'd got to the manual switches and tried to turn on the pumps, they couldn't have, because the pumps were locked out. The reason the pumps were locked out was that in the springtime, when they had divers in the water, there was a risk of divers being sucked into the inlet. The issue is that any time divers were anywhere in the water within 100 metres of the rig, they turned the pumps off. Not just when they were near the inlets, but when they were anywhere.
Divers would come out of the water at six o'clock in the evening, and this was nine o'clock at night, but they just never turned the pumps back on in the evenings. And then, of course, the default for evacuation was for people to go to the accommodation area and wait for evacuation by helicopter, and that was what they were trying to do. We know 80 people died in that accommodation area. They suffocated, and they found those people at the bottom of the ocean the next day, when they managed to recover the accommodation unit. There was no leadership, and to make matters worse, people were constantly opening the doors and even breaking the windows to let fresh air in, which only let the smoke in. So there's this whole litany of failures, and there are a couple of stories. There was one young chap who survived to be able to report it. He said he came onto Piper Alpha for the first time at 6pm that day. He was given basically a 10-minute induction brief. At nine o'clock he's in his quarters and the fire starts. He survived because he got to a point where he decided to jump off. He didn't know where he was, and to this day no one will ever know where he was, because he'd only just come to the rig. But he decided, and this is the sad part, that it was better to jump to his death than to be burned alive. So in that litany, and you can read it in the book, there were 10 different things that had to fail before this catastrophic failure. And again, just to make it worse: one of the reasons the explosion was so massive, when there would otherwise have been only half a dozen people killed, was that one of the other rigs continued to pump gas into the riser and feed the fire. And the reason the offshore superintendent did that was not because he wanted to make things worse, but because he thought Piper Alpha would be able to deal with it.
And when I looked into it, the amount of training he'd had in crisis management, and his instructions on what to do in an emergency: precisely zero. So this poor bloke has had to live with that. You would expect that people in that kind of responsible position, working in oil and gas or a high-risk environment, would have a rightful expectation of some training, of running some drills and exercises. And yet, no. And of course, later in the book you see the example of Deepwater Horizon, which fundamentally isn't all that different. You see the same types of failures. In fact, on Deepwater Horizon, the BP executives were there on the day to give a safety award to the Deepwater Horizon crew. And when you look at these things, again, like Piper Alpha, the permit-to-work system had deficiencies that were known from audits. The blowout preventer on Deepwater Horizon was an identified issue; when it needed to be activated, it failed due to something that had actually happened weeks beforehand, and that they knew about. And, I forget my exact numbers, they're in the book, but Transocean, the contractor BP had selected to run their drilling program with the Deepwater Horizon, the floating drill rig, ran 40 percent of the rigs in the Gulf of Mexico and had over 70 percent of the safety incidents. Now, if that wasn't a red flag for someone like BP, who procures rigs all the time... It's not out of the blue. This is not a black swan. There's always a litany of things.

Yoyo:

Yeah, it reeks to me of brown-envelope business, to be honest with you. However, there were a number of technical improvements that became mandatory after the Piper Alpha accident, such as fire protection of risers, the pipes connecting oil platforms to underwater pipelines, improvements to subsea isolation valves, and obligations to include automatic shutdown valves. You read that and you think, why were these even built without a shutdown valve? They're in the middle of the ocean. Crikey, is it just me sitting here thinking that?

Julian:

And that was the thing that failed on Deepwater Horizon. They had a blowout preventer, and you can read in the book the litany of failures that should have and could have been rectified beforehand. It's still the world's worst environmental incident to this day. One good thing that did come out of Piper Alpha was safety case legislation, and that was an interesting area where governments do have a role. The government said to Occidental, the company, but essentially to the oil and gas industry, and this was the British government, though it's now spread across the world: either you sort this out, or we'll sort it out, and you won't like what we do, because we don't know your industry and we'll just come in with a sledgehammer and legislate, and you will really not like it. So sort it out. And they did. They created the safety case, which is essentially a process where you do a hazard and operability study, a hazard identification study, and then document how you propose to operate the facility safely. A lot of industries use it. You'll see safety cases across nuclear power plants, chemical factories, even aviation. It's probably the single strongest example of a good safety or risk management framework that's become widespread.

Yoyo:

Let's round up with this final question, then, around health and safety: in countries where there is a greater degree of litigiousness, there seems to be a greater effort on health and safety.

Julian:

I think there are two categories of organizations. Working in oil and gas, they really get it. I love working with oil and gas because they understand that things blow up if you don't get it right. It's the same in the aviation sector, the same in nuclear power plants. I love all these high-reliability organizations, generally, because they'll invest in safety, because they see an ROI. I said to a client recently that what you need to do to sort out your risk management program is a training needs analysis, and then invest in that training. And I mentioned work I've done with Woodside, where we looked at training managers in health, safety and environment, looked at contract training, and a whole lot of risk and relationship management and negotiating skills and all sorts of things. This client was a government agency, and their response was, that's okay for Woodside, but they've got lots of money. I said, they've got lots of money because everything is done on an ROI basis. If you take that foundational idea about return on investment, if you use ROI as your foundational concept for training, you build a culture where you have effectively unlimited resources for training; you get an unlimited amount of money for health and safety, and you can do whatever you want. Now, that's unfortunately a small group of organizations, and that's where you do need government legislation, and where it does work. In Australia, we have industrial manslaughter laws: a director of a company can go to jail if someone is killed on their worksite through negligence, and that's a pretty big stick. And, sad to say, we were talking earlier about cyber security, and it's the same question: do you use the compliance approach, or do you take best practice and take advantage of the opportunities? Most organizations, in my experience, are focused on compliance. They'll do the bare minimum. But for me, compliance is just a licence to operate.

Yoyo:

Sometimes it can come down to personal responsibility as well. I was working in an industry where I would turn up, speak to my employees and visit them, and there wouldn't even be a risk assessment on site. And it was drilled into us that if there wasn't a risk assessment, and it was your area of responsibility to manage that location, and something happened to an employee, then I would be the one facing jail. Simple. Because that's my responsibility. There were way too many places I went to where there wasn't a risk assessment in place, put it that way, for me not to start getting a little bit tired and asking, am I the only one? Because you can sometimes feel like you're the only one putting really good mitigation in place, or just doing the job. I'm sure there are lots of employees out there thinking, I wish I had that too: other people in my organization taking responsibility for the fact that things should be done according to process.

Julian:

And a lot of people don't understand that big picture, and they shy away from risk assessments because, in their minds, if we write a risk assessment, then we've written something down that we have to respond to, but if we don't write it down, we can ignore it. Honestly, I'm the opposite, both from a moral perspective and a legal perspective. I used to be an ambulance officer, and we had a saying there: if it isn't written down, it didn't happen. You know what I mean? You could take a patient to hospital and say, I gave them the best care, I gave them oxygen therapy, I splinted them, but if that's not in the documentation, you didn't do any of it. And that, for me, is how it is with risk assessment. You might look at the risks legitimately and say, I think this has a 1 percent chance of happening, so I'm going to prioritise, to pick an example, road safety for our vehicle fleet ahead of slips, trips and falls in the office. I've only got X amount of resources, so I spend it on road safety. And yes, someone slips and breaks their neck. Okay, that's bad. But at least you can say, I made a considered decision based on an understood risk.

Yoyo:

100%. So look, I'm going to wrap up now by saying that if anybody hasn't seen Five Days at Memorial, it's a five-episode adaptation of Hurricane Katrina from the perspective of Memorial Hospital, which was in downtown New Orleans when the levees broke and the floods came. I say, as a risk professional, it's a great drama of what happened, depicting real-life events, and it had some very significant, long-lasting impacts on me for sure. For example, everybody was warned that the water was coming, that the levees had broken, and they were even warned in advance that the depth of the flood was going to be over 10 feet. And there's this scene in one of the episodes where all of their bottled water was in the basement, and I'm shouting, get the water out of the basement, get the water out of the basement! And what happened? The water comes, and all of their supplies, food and water, are underwater. You can't help, as a crisis manager, watching it and seeing so many learnings and so many mistakes. And one of the other most dramatic impacts, I think, was the fact that they shut the hospital doors to anybody coming to them for help and salvation, and primarily they were people of colour. There just seemed to be this attitude of, we can't let any more people in, it's going to jeopardise the comfortable situation we've got. And it wasn't comfortable: people were dying, and there was a real, serious risk to life, and all those people who had worked so hard to find anything to float on, to try to get to the hospital, were being turned away. It was grossly hard to watch, to be honest.

Julian:

Yeah, I will check that out. Yeah.

Yoyo:

Yeah, Five Days at Memorial. It's got a good cast in it as well.

Julian:

I'm constantly surprised. I worked in Africa for a few years, and you always had to be thinking: where's your plan B? How are you going to deal with it if this happens? I'll do a plug, because I just published that whole story in a little book called Tactical Risk Management, which is our next step in trying to make risk management real. What do I do when the floodwaters are rising? Not the abstract spreadsheet: what do I do when I'm actually in a burning building?

Yoyo:

Yeah, but then let's face it, you can look back and be pragmatic. These are surgeons and consultants. They don't know how to look at situational risk other than in a human body. They can deal with the power going out during an operation or a consultation, but they can't deal with anything outside that very niche, very specialist area of risk management on the body. They couldn't see the broader picture. They couldn't plan ahead. They couldn't think, crikey, the water's nowhere near us now, but they made no provision for the fact that it might be, or for what would happen as a consequence. And I don't care how busy the hospital is, you keep the doors open and it's a safe place for everyone. You can't treat everyone, obviously, and you can't help everyone, but it's still a sanctuary, a safer place. So it's Five Days at Memorial. I always push that because, as a risk professional, you look at that and learn a lot from what went wrong.

Julian:

The first book that a friend and I wrote was the Security Risk Management Body of Knowledge. We used to have big ambitions: we're about to publish the Risk Management Body of Knowledge, and then we'll update the Security Risk Management Body of Knowledge to a third edition. But SRMBOK now has its own website and its own newsletter, so if you'd like to stay tuned to the best security risk management newsletter in the world, self-proclaimed by the editor, it's SRMBOK.com, SecurityRiskManagementBodyKnowledge.com. You'll get to hear about this cool podcast that Yoyo does, for example. You're in the next newsletter coming out tomorrow. Yeah,

Yoyo:

and I subscribed, and it's very easy to subscribe. You can subscribe to the Security Circle at www.ifpod.org and catch all the latest news. I love your newsletter, Julian. There's an opportunity there: everyone's pushing out newsletters right now, and I think you've just got to be very brutal and focus on the things you want to learn. If anybody is interested in a really, really good newsletter, definitely get Julian's, because Julian, you're a beautiful writer and it's lovely to read what you've written. Good luck with the new book; we'll obviously push that out on the Security Circle, huge fans. And thank you so much for being a guest on the Security Circle today.

Julian:

My pleasure. Thank you so much for having me.