
The Entropy Podcast
Nibble Knowledge is delighted to bring you "The Entropy Podcast"—hosted by Francis Gorman.
The Entropy Podcast centers on cybersecurity, technology, and business, featuring conversations with accomplished professionals who share real-world knowledge and experience. Our goal is simple: to leave you better informed and inspired after every episode.
We chose the name “Entropy” because it symbolizes the constant flux and unpredictability in cybersecurity, technology, and business. By understanding the forces that drive change and “disorder,” we can create better strategies to adapt and thrive in an ever-evolving technology and geopolitical landscape.
Disclaimer: The views and opinions expressed on all episodes of this podcast are solely those of the host and guests, based on personal experiences. They do not represent facts and are not intended to defame or harm any individual or business. Listeners are encouraged to form their own opinions.
Bridging Safety and Security in Engineering with Dr. Bob Oates
Discover the unexpected synergy between safety and security engineering as I chat with Dr. Bob Oates, Associate Director at Cambridge Consultants. Prepare to learn how these seemingly distinct fields align in their ultimate mission to prevent harm. Sharing insights from across his distinguished career, Bob brings the topic to life with a fantastic example: the groundbreaking project he worked on whilst with Rolls Royce, delivering the world's first commercial remotely operated ship. We also unravel the vital role of systems engineering in managing risks and ensuring seamless communication among diverse teams.
Together, we explore the many considerations of safety and security in industries where the stakes are high, like shipping, aviation, and finance. We also discuss developments in post-quantum cryptography, to understand potential threats such as "harvest now, decrypt later" and the necessity of updating cryptographic practices. Bob and I emphasize the importance of preparing for these advancements, drawing parallels with safety systems to highlight the need for traceability and effective stakeholder communication.
Takeaways
- Safety and security aim to prevent harm, whether accidental or malicious.
- Quality is the foundation of both safety and security.
- Systems engineering is crucial for connecting safety and security disciplines.
- Remote operations introduce unique safety and security challenges.
- Connectivity is a critical component for safety in remote systems.
- The integration of safety and security can lead to complex design challenges.
- Cryptography is essential for maintaining security in operational technology.
- Quantum computing poses new threats to traditional cryptographic methods.
- There is a skills shortage in both security and safety engineering fields.
- Understanding the value of cryptography is vital for stakeholder engagement.
Francis Gorman (00:02)
Hi everyone, I'm Francis Gorman and this is the Entropy Podcast and I'm delighted to be joined by Bob Oates, Associate Director with Cambridge Consultants. How are you Bob?
Bob Oates (00:12)
Very well, thanks Francis. Thanks for having me on board.
Francis Gorman (00:15)
Great to have you as our very first guest on the show. It's been a bit of a roller coaster year so far in the technology and political world. We've had a 500 billion investment announcement in AI. We've had two trillion wiped off the stock exchange. Meta has stopped doing fact checking, and Google are changing names on maps. And, you know, the Entropy Podcast has dropped. So, you know, we're only a few weeks in. So that's all good stuff, I suppose.
Bob Oates (00:38)
All massive events in security. All of them.
Francis Gorman (00:44)
So Bob, we're going to talk a little bit about security engineering and the interlock with safety engineering. I know you've got some really good examples in this field, and it's a topic that probably doesn't get discussed a lot in terms of the wider technology landscape and how we knit together our disparate technologies for a singular outcome. You had some really interesting experience with Rolls Royce and Svitzer in your past, and I think maybe we could talk a little bit about that project if that's okay.
Bob Oates (01:12)
Yeah, absolutely. I guess I think it's worthwhile talking a little bit about just safety and security in general, then I can explain how that plays out in a project. That's probably a good way of starting that. Safety and security are really different disciplines, but they're rooted in the same objective. We're trying to prevent harm, whether that's accidental or malicious.
And to do that, you've got to manage risk in technology. That's it. So they align beautifully. And where they connect really is quality. If you can't do quality, you can't do security. And if you can't do quality, you can't do safety. Because as engineers, your only real lever for controlling risk is to set requirements and say, this system shall have a firewall configured in this way, or this system shall have redundancy to stop one failure causing a massive incident. And if you haven't got quality underpinning them, you can't be sure your requirements are going to be delivered. So they're both rooted in this notion of quality delivery. The only real way of connecting them when they work on a project is systems engineering. You've got to take these multiple views on the system so that everyone is communicating and you can see what impact design decisions by one set of engineers are going to have on everyone else. And for the most part, I would argue safety and security pull in exactly the same direction. There are two types of occasion when that isn't the case. One is when they generate churn, the rework and redesign that can get really expensive very quickly, because they push you around more and more design iterations pulling in different directions. And occasionally they pull in literally opposite directions; they want different things out of the system. So with that in mind, we worked on what is the world's first commercial remotely operated ship, which is a bit of a mouthful. And yeah, it was really cool.
Francis Gorman (03:43)
Pretty cool though.
Pretty cool project for the CV.
Bob Oates (03:48)
It was such a wonderful project as well, because everyone was really committed. We had a lot of support. It was long hours and tight timescales and really complex technology, but working on that project is one of the happiest memories I have of being an engineer. It was really, really great.
One of the things, because we were delivering this very complex project and it had to be commercial, what that meant was we had to get it regulated. We had to go to the classification society and demonstrate that it was safe and secure. We had to go to the flag state. And I always think this is quite an interesting thing that security doesn't do as much, but safety has really got down: the generation of what they call a safety case, which is an evidence-based argument that you've done enough.
I don't think the security community does that as rigorously as they could in certain situations. In this project's case, we actually tried to mimic it, we tried to blend them. Because when you've got a tugboat, which is basically the biggest set of engines you can find loaded into the smallest ship that can hold them, that can do a lot of damage. There's a lot of thrust in those systems; they pull really big ships. And there's a real benefit to trying to make them remotely operated, because actually, if a big ship gets in trouble, it could pull a tugboat under. And that's dangerous for the people. There's a reason we have the expression, worse things happen at sea. Being at sea is dangerous. So if we can try and reduce people's risk, that's a good thing.
Francis Gorman (05:45)
And Bob, when we think about that, you're talking about a pretty big vessel here in the project that we're discussing, and you don't have a captain on board the ship, you know, it's all remote. How does that work? How does that hang together? That's a pretty big safety concern, or even in terms of a new era of technology meeting, you know, automotive infrastructure and meshing it all together and putting it in water.
Bob Oates (06:00)
So.
Francis Gorman (06:15)
There must be so many different considerations there in terms of the safety and security aspects all within a singular program.
Bob Oates (06:23)
Absolutely. And I should say we always had a safety crew on board for our trials and testing, because it just wouldn't have been right not to have done the sea trials with someone with a big button able to say, no, I'm going to take control. But yeah, you're right. There was a lot of complexity to manage.
Francis Gorman (06:38)
That's reassuring.
Bob Oates (06:50)
You know, you've got connectivity. If your safety is reliant on being able to communicate between two points, then of course, that comms channel becomes a sort of safety-critical component. And that's a challenging thing to do. Your average, you know, your mobile phone provider is not going to sign a contract to say that they will definitely maintain connection all the time. So you've got to think about that.
I think one of the biggest technical challenges actually was this intrinsic connection between the safety and the security. If a highly connected digital system is not secure, how can you possibly say it's safe? Because the safety relies on the integrity of the software and the data and the network behaving in a deterministic way, within bounds. If you allow a malicious attacker to run riot in that system, your safety argument just falls apart. So that was the really interesting thing for me: when those things are so entwined, how do we make sure they work well together? And also, safety wasn't the only set of goals. When you've got a remote ship, there's a lot of intellectual property, there's a lot of interesting code. So while safety is obviously the most important goal, you've got other security concerns that you need to worry about. You've got to balance them. Yeah.
Francis Gorman (08:31)
I really want to see your threat model.
The architect in me is dying to see all of the components broken down for this.
Bob Oates (08:40)
And we had to do that. And there was a lot of really good systems engineering. I'm a big fan of something called model-based systems engineering, where you have a single coherent model and people can look at it with different... There's a lot of great tools out there now where you can draw these architectures out, map out the comms system, and it will show you that system in one way if you're a safety engineer and you're used to seeing diagrams of a particular sort.
And it'll show you in a different way if you're a security engineer, but you're looking at the same underlying data model. And that's a really powerful tool for that sort of thing. I'm just sort of running through some of the things. Maybe I should give a very specific example that I think is quite a nice thing. So we had a set of very specific conditions that we wanted to avoid. And the safety team, whenever you see safety and security connecting,
Francis Gorman (09:28)
Please.
Bob Oates (09:38)
that usually is how it works. The safety team are really good at identifying horrible things; that is part of their mindset. And you see it in aerospace as well. The aerospace standard, ED-202A, gives you this flow diagram of how the safety team and the security team are supposed to share information, which is really useful. And it starts with the safety team saying, I don't want this thing to collide with ships, I don't want it to lose sensors when I'm in the middle of the ocean, you know, all of these scenarios. And then it's the security team's job to make sure that they are preventing those undesirable situations from manifesting. So you sort of trace it backwards.
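A minimal sketch of the traceability Bob describes: safety-identified hazards are traced back to the security controls protecting the digital components in their causal chains. The hazard, asset, and control names are hypothetical illustrations, not details from the project.

```python
# Sketch of the ED-202A-style hand-off described above: the safety team names
# undesirable conditions, and security traces backwards to check that every
# digital component in a hazard's causal chain has a control safeguarding it.
# Hazard, asset, and control names here are hypothetical.

from dataclasses import dataclass

@dataclass
class Hazard:
    name: str                  # undesirable condition named by the safety team
    digital_assets: list[str]  # digital components in its chain of causality

@dataclass
class SecurityControl:
    name: str
    protects: list[str]        # assets this control safeguards

hazards = [
    Hazard("Collision with another vessel", ["nav_sensor_feed", "remote_command_link"]),
    Hazard("Loss of sensors mid-ocean", ["nav_sensor_feed"]),
]

controls = [
    SecurityControl("Authenticated, integrity-protected command link", ["remote_command_link"]),
    SecurityControl("Signed firmware and input validation on sensors", ["nav_sensor_feed"]),
]

def uncovered_assets(hazards, controls):
    """Map each hazard to the digital assets no security control traces to."""
    covered = {asset for control in controls for asset in control.protects}
    return {h.name: [a for a in h.digital_assets if a not in covered] for h in hazards}

for hazard_name, gaps in uncovered_assets(hazards, controls).items():
    status = "covered" if not gaps else f"GAP - no control traces to {gaps}"
    print(f"{hazard_name}: {status}")
```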
Francis Gorman (10:25)
That's a really interesting concept, the two streams running beside each other. Is there something there we should be looking at in the wider industry view of the world? You know, you're talking about highly regulated systems engineering in terms of ships and planes and drones and, you know, stuff that when it goes wrong, it goes really wrong.
When we look at other industries, pharma, finance, et cetera, is there something in that dual stream of safety and security meshed together? Is there a framework in the making there? Is this SABSA 3.0 or something that we could potentially cobble together?
Bob Oates (10:59)
I think it's not dissimilar to the really core NIST RMF, where you basically say, we've got to know what you've got. What are your crown jewels? What do you really care about? Because if you try to secure everything to an equal level of rigor, you'll blow through your budget and nothing will end up being properly secure. So how do you prioritize the things that are really valuable to you? And in traditional cyber,
we would do that by doing a CIA assessment or something similar and say, what are the impacts, the system-level, the business-level impacts of those failures? And we'd prioritize accordingly. It's just a slight extension when you've got safety involved, where they say, well, here are the anti-patterns, this is what I really don't want to happen. And actually the safety engineers, of course, will also do a lot of analysis about how specific system failures map up to system-wide effects that are undesirable. You know, if that bolt wears through, what does that mean for the propulsion system, and what does that mean for safety? So they're really good at making those maps. And then it's up to you as a security person to go in and say, in that big tree of failure mapped to disaster, are there digital components, and what happens if someone loses control of those bits of software, or that data isn't what they thought it was, and can I safeguard that? Because those are the really important things; they sit in that chain of causality to disaster. And I think, again, it's not dissimilar to the way traditional security works. It's just a slightly different perspective.
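A back-of-the-envelope sketch of the prioritization described above, extending a classic CIA impact score with a flag for whether a component sits in a chain of causality to a safety-level event. The component names, scores, and weighting are assumptions for illustration only.

```python
# Sketch: prioritizing security effort by combining a CIA impact assessment
# with whether a component sits in a failure-to-disaster chain identified by
# the safety analysis. Component names, scores, and weights are illustrative.

def priority(confidentiality: int, integrity: int, availability: int,
             in_safety_chain: bool, safety_weight: int = 5) -> int:
    """Higher score = secure this first. CIA impacts scored 0-3 each."""
    base = confidentiality + integrity + availability
    return base + (safety_weight if in_safety_chain else 0)

components = {
    # name: (C, I, A, sits in a chain of causality to disaster?)
    "propulsion_controller": (1, 3, 3, True),
    "remote_command_link":   (2, 3, 3, True),
    "ids_log_store":         (2, 2, 1, False),
    "crew_wifi_portal":      (1, 1, 1, False),
}

ranked = sorted(components.items(), key=lambda kv: priority(*kv[1]), reverse=True)

for name, attrs in ranked:
    print(f"{priority(*attrs):>2}  {name}")
```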
Francis Gorman (12:55)
It's like we take the kill chain in security, look at it with the "what we really don't want to happen" lens, wrap that around your threat model, and you've got a dual-lane approach to something that's really going to give you a fantastic outcome. When the project came together and you stepped away, there must have been an awful sense of accomplishment and pride to see it, you know... I want to say take flight, but that's the wrong analogy... to see it set sail on its maiden voyage.
Bob Oates (13:16)
Yeah.
Yeah, absolutely. We were a very close-knit team and it was an amazing day. As you do, you have these days where you invite various dignitaries to come and see it. There was a real sense of pride in what we'd achieved in a surprising timescale.
Although of course, as a security person, you're always sort of cursed in some ways. There are no pretty pictures to show people. You can't really go to a very senior manager or very senior stakeholder and show them ones and zeros flying across the screen and say, I did that, it's really good. It's very difficult. The bit of that project I found most interesting was the intrusion detection system.
Bob Oates (14:20)
And I would have loved to talk at great length to anybody who'd listen about the design of the intrusion detection system. But yeah, it's not for everyone. But it was actually a really good example of how safety and security interact, because with intrusion detection systems, you're looking for anomalies. And actually, by design, safety incidents are anomalous. You hope so anyway. You don't want them to be the norm.
So if you go purely on anomaly-based detection, you're going to start raising alerts when there's a safety incident. And that's when you want to be bothering the operator the least. You don't want them to be managing a genuine incident while you're flagging up loads of false positives saying the network traffic's doing something unusual. So that was a really interesting thing. And of course, there are practical concerns that you don't get in an office. We had to use fiber optic cable for all the networking. And the reason for that is ships get hit by lightning all the time. And intrusion detection systems touch every other system. So if they get electrically charged to a high degree, they can damage everything, and that can wipe out the ship. So again, it's another safety concern that you wouldn't necessarily think about if you were just securing an office.
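A minimal sketch of the anomaly-versus-safety-incident tension Bob describes: during a declared safety incident, lower-scoring anomaly alerts are recorded but not pushed to the operator. The alert fields and threshold are assumptions, not the project's actual IDS design.

```python
# Sketch: keep anomaly evidence but avoid paging the operator with low-value
# alerts while a safety incident is being handled. Fields and thresholds are
# assumed for illustration only.

import logging
from dataclasses import dataclass

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ids")

@dataclass
class AnomalyAlert:
    source: str
    score: float        # 0.0 (benign) .. 1.0 (highly anomalous)
    description: str

def handle_alert(alert: AnomalyAlert, safety_incident_active: bool,
                 operator_threshold: float = 0.9) -> None:
    # Always record the evidence for post-incident analysis.
    log.info("recorded anomaly from %s (score %.2f): %s",
             alert.source, alert.score, alert.description)

    # During a safety incident, only the most severe anomalies reach the
    # operator; everything else is queued rather than raised immediately.
    if safety_incident_active and alert.score < operator_threshold:
        log.info("safety incident active - alert queued, operator not paged")
        return
    log.warning("OPERATOR ALERT: %s - %s", alert.source, alert.description)

# Example: a noisy traffic anomaly arriving during a genuine safety incident.
handle_alert(AnomalyAlert("nav_network", 0.6, "unusual broadcast volume"),
             safety_incident_active=True)
```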
Francis Gorman (15:43)
That is brilliant. You never think of that. So the guy who came with the copper is in big trouble when he lands on the ship. That's fantastic. Yeah. OK. So hit by lightning out at sea, it fries all your systems if it's wired wrong. That's just another real gotcha if you don't really think these things through at a macro level. And the anomaly piece is fascinating, Bob, in terms of, you know...
Bob Oates (15:48)
Yeah.
Francis Gorman (16:07)
looking for the different patterns. I think we're in an industry at the moment where everyone is selling capabilities: you know, we'll detect it, we'll do a zero trust implementation, we'll surface all the uglies in your databases and your applications and across your network, et cetera. But we know as security architects and engineers that unless you know what you're looking for, unless you have high-fidelity, security-relevant information feeding into your SIEM and your SOC teams, the likelihood that you will discover anything other than having a massive log store is probably very low.
Bob Oates (16:42)
Yeah, and I actually would say, I know we always like to think we've got the hardest job in the world, but operational technology is even worse, because a lot of the suppliers of operational technology are very small organizations. They're very functionally driven, they want the device just to work, and you want it to work straight out of the box. And I've seen some absolute horror stories, you know. It's all beautiful, ergonomically designed white plastic and blue LEDs, and it's the latest and greatest version of this particular sensor array or whatever it is, and it's running 15-year-old Linux under the hood, because that works and that's always worked and why would I change it, it's not broken. So trying to change that mindset... and when you plug those things into your network, then obviously you end up with anomalies, because essentially they've lost control of their software stack. So they're kicking out packets, because it's a legacy part of the operating system, or even with the newest stuff, if they misconfigure it and they're kicking out IPv6 on your beautiful IPv4 network, that's going to raise some anomalies as well. Yeah, it's a real problem.
Francis Gorman (17:55)
You're triggering me now on my IoT concerns. Then, you know, when we break down all of these things at a hardware level across a ship, all those different devices, you've got to make sure that they're within the boundary of your network, that they're all protected, that somebody hasn't put a system in through the side door that has something in it.
When I think about smart meters in the electrical grid at the moment, it keeps me up at night. What I want to know is, have we got a single cert authenticating everyone's meter, or have we got multiple different breakdowns in the authentication cycle, and what can push what where, and does it have a back door into my network if it all goes wrong? It's that OT and IoT problem, you know, it's complex.
Bob Oates (18:44)
I think in transport especially, maritime and aerospace, the advantage is that systems tend to be quite rigorously designed. So you know who should be talking to what in advance. That's one of the things we leveraged. You can draw a map fairly quickly: you know that the front-facing camera probably shouldn't be sending packets to the engine controller. So you can put some really logical, functional parameters into that system definition. Whereas I think with enterprise systems, they do tend to evolve a bit more. People connect devices that maybe they shouldn't, or maybe, you know, because people are under pressure and they've got to deliver, they connect the latest and greatest system to help them run things. Those systems evolve, I think, a lot faster. Whereas certainly in aerospace, you wouldn't change the configuration of the network in the field without signing about 80 different legal documents and making sure that you were definitely the person you said you are and that sort of thing. So in some ways, some bits of that problem are easier for us. But yeah, it's certainly not plain sailing, if you'll pardon the pun.
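A minimal sketch of the "who should be talking to what" map Bob mentions: a declared allow-list of expected flows taken from the system definition, with anything outside it flagged. Device names and flows are hypothetical.

```python
# Sketch: a declared allow-list of expected network flows for a rigorously
# designed OT system, where any flow outside the map is treated as an anomaly.
# Device names and flows are hypothetical illustrations.

ALLOWED_FLOWS = {
    ("front_camera", "video_gateway"),
    ("video_gateway", "remote_operations_center"),
    ("remote_operations_center", "engine_controller"),
    ("nav_sensors", "autopilot"),
}

def check_flow(src: str, dst: str) -> bool:
    """Return True if this flow appears in the designed system definition."""
    return (src, dst) in ALLOWED_FLOWS

observed = [
    ("front_camera", "video_gateway"),
    ("front_camera", "engine_controller"),   # should never happen by design
]

for src, dst in observed:
    verdict = "expected" if check_flow(src, dst) else "ANOMALY - not in system definition"
    print(f"{src} -> {dst}: {verdict}")
```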
Francis Gorman (20:08)
I like that one. Yeah, that is definitely a pun we can end on. In terms of your wider experience, Bob, I've been following your career on LinkedIn, and I can see you're doing lots of warm-up talks in the post-quantum space and getting into the detail. And I know we've had a couple of conversations on post-quantum in the past. And, you know, that's a real future challenge, probably not well understood at the moment. AI is still
the predominant player in terms of what everyone wants to talk about, but there's a real risk. And when we see what's happened with DeepSeek coming out of the blue to kind of undermine an entire industry this month, you have to wonder what else is cooking behind closed doors in the quantum space. Do you want to give some views on where you see cryptography heading, and some of the gotchas that people may not yet understand or realize are coming, probably quicker than we can determine, even though there's no defined timeline?
Bob Oates (20:59)
Yeah
Absolutely. Yeah, it's another wonderful, complex problem. You know, that's what keeps us getting up in the morning. And it connects to OT, so I will try and talk about how I think it also connects to the OT world. So in terms of the threat, obviously, we've got harvest now, decrypt later. We've got people who are confident enough that they're going to be able to read intercepted traffic, so they're storing it en masse, hoping that they'll be able to read it at some point. And, well, it's a pressing threat, but at the same time, we have to balance that quantum computers are slow, and they're always going to be slow. They're not going to be solving the data deluge problem anytime soon. They're really good at computationally intensive but low-data problems.
So quantum computers are going to do amazing things in the future. We should always focus on this as a really positive technology. It's just that one of the things they're really good at is essentially calculating the private key from a public key, which is, again, a nice low-data, high-computation problem. So if you bear that in mind, the harvest now, decrypt later people have got a really hard job. They're going to have to think very carefully about what they want to decrypt, because even when they've got the machine, it's going to take them quite a few hours per key to start that decryption process. So I think the people who are most at risk are probably, you know, nation states being targeted by other nation states. That's the reality. But as it gets more common, we do need to worry about it. The thing that really makes me quite nervous is digital signature-based attacks, forging certificates. In a lot of operational technology, as well as enterprise technology, your entire security premise for stopping people executing arbitrary code is that you can use a digital signature to demonstrate that it comes from someone you trust. And if you can run whatever you like on a PLC, or in PowerShell, or whatever you want to do, that's obviously a really disruptive problem. And those certificates are not updated very often. They can have lifetimes of maybe a year, or years. And that gives the attacker lots of time to compute a forgery.
So that's the one that worries me in the longer term, maybe the medium term, let's say. For OT, that's going to be a real problem, because all of these operational technology systems are really running on a knife edge. The new post-quantum cryptography algorithms do have different properties. They will need different types of computation, so they'll introduce latency, because you've got to process them. Potentially some of them are faster, in actual fact, but it's all very difficult. You have to be aware of what you're introducing into your system, because in control systems, latency can actually destabilize them, or they'll stop working. Bandwidth as well: key exchange takes about an extra kilobyte per handshake, which doesn't sound like a lot, but if you're operating at scale and you're doing lots of handshakes, or you've got this tiny network that's already very busy...
Yeah, so I think people are going to need to think very carefully.
Francis Gorman (24:55)
you're going to topple your network devices and cause all sorts of intermittent problems for your business.
Bob Oates (25:00)
Yeah. It's the risk of
action as well as the risk of inaction. That's the problem.
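To put the "extra kilobyte per handshake" figure from the conversation in context, here is a rough, assumption-laden estimate of the added bandwidth from post-quantum key exchange at different handshake rates. The per-handshake overhead is simply the figure quoted above, and the scenario rates are invented for illustration.

```python
# Back-of-the-envelope sketch: extra sustained bandwidth from post-quantum
# key exchange at scale. The ~1 KB per-handshake overhead is the rough figure
# quoted in the conversation; handshake rates below are assumed examples.

EXTRA_BYTES_PER_HANDSHAKE = 1024   # rough figure quoted above

def extra_bandwidth_kbps(handshakes_per_second: float,
                         extra_bytes: int = EXTRA_BYTES_PER_HANDSHAKE) -> float:
    """Additional sustained bandwidth in kilobits per second."""
    return handshakes_per_second * extra_bytes * 8 / 1000

scenarios = {
    "single sensor re-keying every 10 s": 0.1,
    "fleet of 500 devices re-keying every 60 s": 500 / 60,
    "busy gateway, 200 new sessions per second": 200,
}

for name, rate in scenarios.items():
    print(f"{name}: ~{extra_bandwidth_kbps(rate):.1f} kbit/s extra")
```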
Francis Gorman (25:07)
And I think that's
a really key message. You know, I've been spending a lot of time talking about quantum lately and what it means, and, you know, trying not to be the fearmonger, but also to have some realities around it. And I think for a lot of organizations, even understanding what cryptography means to them, in terms of their inventory and their visibility and their understanding of all of the above, is the first major challenge. Then you get that visibility and maybe you go, we're not even at today's standards, never mind post-quantum. We've got to do some work here, you know, and you've got to bring stakeholders on a journey. So it's discovery, and then it's stakeholder management, and then you've got to get programs that work together. You know, the years to quantum are probably going to catch up on your efforts at some point. So it's a really nuanced problem. And I think if we go back and we talk about Y2K, a lot of people speak of Y2K like a non-event, but the reality was there was a lot of preparation behind the scenes to change programs and systems and different things, so that when it came, it was a bit of a non-event. I remember being entirely disappointed when the clocks changed and everything didn't blow up around me. You know, I see people calling it a "time to Q-day," which doesn't sit right with me at all. But whether it's years to Q or whatever we're going to call it, there's definitely going to come a marketing term, and we need to be careful to make sure the realities of the work that needs to be done come through. Because regardless of the ability or the speed, understanding your cryptographic inventory, where it sits, and mapping it all out is something we probably should have everywhere anyway.
Bob Oates (26:53)
Yeah, and there's actually a lovely lesson we can learn from safety again: when you deploy some new technology in a safety system, even if it's a control to make you safer, they have traceability. They can always trace it back to justify why did I add that extra complexity, what's it giving me? And we need to be able to make the argument about cryptography to our stakeholders, to be able to say, this is the value of the cryptography, this is what it's doing for your organization and helping you reach your goals. Because otherwise you're asking them for investment and it's very easy for it to feel like, there's the security team again, there's another pressing threat, yet another, we've only just dealt with AI, what are you talking about? So I think, yeah, it's really important to be able to articulate the value of that stuff, as well as, of course, knowing where you've actually got cryptography and what you're doing with it.
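As a starting point for knowing where you've actually got cryptography, here is a minimal sketch that inventories PEM-encoded X.509 certificates in a directory and reports key type, size, and expiry. It assumes the third-party cryptography package (version 42 or later for the UTC expiry attribute); a real inventory would also need to cover TLS endpoints, code signing, firmware, libraries, and protocol configurations.

```python
# Sketch: a first-pass cryptographic inventory of PEM certificates in a
# directory, reporting key type, key size, and expiry. Assumes the
# third-party "cryptography" package (>= 42 for not_valid_after_utc).

from pathlib import Path
from cryptography import x509
from cryptography.hazmat.primitives.asymmetric import rsa, ec

def inventory(cert_dir: str):
    for path in Path(cert_dir).glob("*.pem"):
        cert = x509.load_pem_x509_certificate(path.read_bytes())
        key = cert.public_key()
        if isinstance(key, rsa.RSAPublicKey):
            algo = f"RSA-{key.key_size}"      # vulnerable to a future quantum attack
        elif isinstance(key, ec.EllipticCurvePublicKey):
            algo = f"EC-{key.curve.name}"     # vulnerable to a future quantum attack
        else:
            algo = type(key).__name__
        yield {
            "file": path.name,
            "subject": cert.subject.rfc4514_string(),
            "key": algo,
            "expires": cert.not_valid_after_utc.date().isoformat(),
        }

if __name__ == "__main__":
    for entry in inventory("./certs"):
        print(entry)
```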
Francis Gorman (27:51)
And then when we tell them that when quantum compute meets AI, God knows what's going to happen. We need a whole other budget to deal with that stuff, you know. There's definitely plenty of work to do over the next couple of years to keep ahead of things. It's not getting any easier.
Bob Oates (27:55)
Yes.
But again, I notice that people always talk about the skills shortage in security, and actually there is a skills shortage in safety too. It's really hard to get people to engage with those disciplines because they look hard, they're scary, you know, there's a lot of responsibility. But it's such a fascinating discipline. People should be told these stories about how amazing the projects you get to work on are.
And you get to do everything, because you have to have at least a surface-level understanding of every bit of your organization, or certainly every bit of the system you're trying to protect, to do it effectively, which means you really do get a great bird's-eye view of some really cool stuff. I would encourage anyone, if you're in security or safety, go and tell whoever will listen how amazing your career has been, how enriched your career has been for being able to play with this cool tech.
Francis Gorman (29:16)
And it is true. I think if I look back at my career today, the amount of amazing projects I've been involved in, you kind of forget about them and all of the challenges. You know, we had cloud emerge and that created a whole lot of problems, and then we had AI just jump out of nowhere, and then quantum appeared, and you can go back even further; there's all sorts of stuff everywhere. And when you look back on it, that's really where you learn. You know, you fail a lot and you make plenty of mistakes and you kind of go, well, I'm a little bit better from that now, even though it hurt a bit at the time. I said some things a couple of years ago that no longer hold weight. But it is fascinating. It's funny, I think, when I'm back home and I'm talking to people, they still think I'm the security man where I work, standing at a door letting people in and out. They don't really get the discipline at all. My wife looks at me when I try to tell her what I'm doing on a day-by-day basis, and I can see her eyes glaze over. You know, it's probably an underappreciated discipline as well, when you have a cybersecurity profession or a safety profession. It doesn't sound wonderful when you start talking about application-level controls or SQL injection attacks, you know, input validation or any of these things. But, you know, when you bring them all together in the picture and then someone gets a nice news story, you can make it sound a bit cooler.
Bob Oates (30:37)
Yeah, although I do worry now, as you've just said, you know, things change, your understanding of the technology changes over time. And here we are recording for posterity what we think today. It'll be interesting to come back to this recording in a couple of years and ask ourselves, do I still believe all those things about quantum computing?
Francis Gorman (30:58)
We'll just blame AI. We'll just say we were impersonated.
No one can fact-check it, but anyway, that's the way things are going. It's all good. Look, Bob, thanks very much for talking to me. Thanks very much for taking part in the very first episode. I really appreciate it. And we'll be keeping a close eye on you in the future. If we survive episode one, we might have you back to talk again, fact-check this episode, and see how we get on.
Bob Oates (31:04)
Well, thank you very much for having me. It's been really fun.
Francis Gorman (31:31)
Thanks a million. Have a great evening. Take care.
Bob Oates (31:33)
Cheers. Goodbye.