The Entropy Podcast
The Entropy Podcast is a cybersecurity, technology, and business podcast hosted by Francis Gorman.
Each episode features in-depth conversations with cybersecurity professionals, technology leaders, and business executives who share real world insights on cyber risk, digital transformation, emerging technologies, leadership, and the evolving threat landscape.
Designed for CISOs, IT leaders, founders, and professionals navigating today’s digital economy, The Entropy Podcast explores how organizations can adapt, innovate, and build resilience in an era defined by constant change, disruption, and geopolitical uncertainty.
The name Entropy reflects the growing complexity and unpredictability of cybersecurity and technology ecosystems and the strategic thinking required to thrive within them.
Topics include:
- Cybersecurity strategy, risk, and resilience
- Post-quantum readiness
- Emerging technologies and innovation (AI, etc.)
- Business leadership and digital transformation
- Cyber threats, regulation, and geopolitics
- Lessons learned from real-world experience
New episodes deliver practical insight, expert perspectives, and actionable knowledge so you stay informed, strategic, and ahead of the curve.
Watch and Subscribe
You can also watch full episodes and exclusive content on our YouTube channel:
https://youtube.com/@nibbleknowledge-v7l?feature=shared
Achievements
The Entropy Podcast delivered strong chart performance throughout 2025, demonstrating consistent international reach and listener engagement.
- Regularly ranked within the Top 20 Technology podcasts in Ireland.
- Achieved a Top 25 placement in the United States Technology charts, holding the position for one week.
- Charted internationally across multiple markets, including Israel, Belgium, and the United Kingdom.
This performance reflects sustained global interest and growing recognition across key podcast markets.
Audio Quality Notice
Some episodes may feature minor variations in audio quality due to remote recording environments and external factors. We continuously strive to deliver the highest possible audio standards and appreciate your understanding.
Disclaimer
The views and opinions expressed in The Entropy Podcast are solely those of the host and guests and are based on personal experience and professional perspectives. They do not constitute factual claims, legal advice, or endorsements, and are not intended to harm or defame any individual or organization. Listeners are encouraged to form their own informed opinions.
Defense Stack with Tim D Williams
In this episode of the Entropy podcast, host Francis Gorman speaks with Tim D. Williams, co-founder and CTO of ProteQC, about the evolving landscape of cybersecurity, particularly in the context of post-quantum cryptography. They discuss the importance of learning from past mistakes, the economics of security architecture, and the critical role of cryptography in protecting data. Tim emphasizes the need for organizations to develop a comprehensive cryptography strategy and the importance of human expertise in navigating complex security challenges. The conversation also touches on the impact of AI on security architecture and the future of cybersecurity education.
Takeaways
- Tim shares a significant learning experience from his early career in cybersecurity.
- Understanding the economics of security is essential for effective architecture.
- Organizations must prioritize cryptography in their security strategies.
- Pre-discovery activities are crucial for effective cryptographic readiness.
- Resource allocation in cybersecurity must be precise and well-planned.
- Estimating costs for quantum readiness is challenging but necessary.
- Human expertise is irreplaceable in cybersecurity, especially with legacy systems.
- AI's role in security must be carefully managed to ensure accountability.
- Education plays a vital role in preparing the next generation of cybersecurity professionals.
- The future of cybersecurity will require a multidisciplinary approach.
Sound Bites
- "Attacks always get better, they never get worse."
- "AI can't replace the need for human expertise."
- "We need to know who is in control of the agents."
You can find the ProteQC website here: https://ProteQC.com
Cryptography course recommended by Tim:
Francis Gorman (00:01.856)
Hi everyone, welcome to the Entropy podcast. I'm your host, Francis Gorman. If you're enjoying our content, please take a moment to like and follow the show wherever you get your podcast from.
Today I'm joined by Tim D. Williams, co-founder and CTO of the post-quantum cryptography professional services startup ProteQC, an organization with a prime focus on enabling pragmatic post-quantum cryptographic readiness. Tim is guided by the principle that attacks always get better, they never get worse. Bringing to bear over 30 years of experience spanning
government, critical infrastructure, and financial services, Tim conducts insightful multi-disciplinary research at the intersections of cybersecurity, psychology, economics, and legal and regulatory affairs. He contributes as a volunteer to professional bodies including ISC2 and the British Computer Society. Tim also lectures at multiple universities, recently in Ukraine. Tim, it's lovely to have you here with me today.
Tim D Williams (00:55.767)
Thank you so much, Francis, for having me.
Francis Gorman (00:58.37)
Tim, we know each other quite well, so we're just going to get into it and toss around a few topics. You have a depth of knowledge, and I'm going to jump around and pull at different facets of it as we continue the conversation. All fair game. So Tim, maybe to start off, let's start with a tough question and work backwards. You've had quite an expansive career, as I just pointed out, across government, critical infrastructure, and financial services.
Tim D Williams (01:12.323)
I'm back again.
Francis Gorman (01:27.042)
What has been the most challenging cybersecurity problem you've had to work on and why?
Tim D Williams (01:33.283)
That's a really great question. I've had several challenges, some of which I can talk about and some of which I cannot. The ones I can most easily talk about are the ones where I was unaware of my own agency and where I made mistakes. And I always think that you make your best learnings from the mistakes you make. So even before I was a cybersecurity specialist, I made quite a few mistakes which helped to shape my views on what's important. I can remember one in particular. It was my first week in a new job at Dell in 1999; I'd just got married. And I was asked to do a data extract for export compliance purposes, for software called Vastiera from JP Morgan that was used for export compliance checking. I was asked to do a 10% extract from the global data warehouse. For the EMEA region, that was about 900 million organizations and data subjects, so I was taking about 30 million, writing my Oracle stored procedures and thinking I was quite clever because I knew how to do Oracle nested functions and recursion and things like that. I'd done a sample extract with a small amount of data with row count on, all working functionally fine. I set the row count off, did a df, disk free, on the Solaris box to check I had some space, and I ran this little procedure. About 40 seconds later, I had various colleagues saying the global data warehouse was down, and then other colleagues saying, well, the global data warehouse never goes down, it hasn't gone down in three years. Has anyone done something?
It turned out that I'd unwittingly spooled this extract to disk, because I was unaware that I was using up a very precious resource on this Sun E10000 in Austin, Texas. And yeah, I was overprivileged in my first week on the job. But that was just a point in my career when I wasn't a cybersecurity specialist. I wasn't thinking about the principle of least privilege. I wasn't thinking about how important it was to educate people to be aware of
Tim D Williams (04:00.458)
what they're doing. So yeah, I made a mistake, and it was a painful one, but one that led me to the path of enlightenment: seeing how important it is to have security controls and structures, including both technical controls, making sure that people don't have more privileges than they need, and also organisational controls, making sure that people are trained and know what they're doing before you set them off doing something.
That's one case. I can think of some others, but that's perhaps a nice example that I can safely talk about.
Francis Gorman (04:39.246)
It is indeed, Tim, and I suppose it does show the risk of overly permissioned access. You know, you were able to take something that was quite harmless from your perspective and bring down multiple organisations. And they're the best lessons, because the pain of that never leaves you. I have a few of those myself. I don't think to that scale, but I've definitely done a few things that have...
Tim D Williams (05:04.404)
You haven't caused a global outage yet. You haven't lived if you haven't caused a global outage. Actually, Dell was a great place for learning. I always feel like the two years there, the eight quarters, were like eight years, because they ran everything in 13-week plans and everything had to fit into quarters. There was another time there, and this was not me, and it does have an Irish connection to it.
Francis Gorman (05:07.854)
Not yet.
Tim D Williams (05:33.656)
They put in an FDDI ring, a WAN around the world, and the global WAN went through the Limerick data centers. They used to have European Manufacturing 1, 2, 3 in Limerick. And there was a deviation from the requirement to put all of the cables into separate trenches, with some local person in Ireland thinking, trenches? Why not just put the two cables into one trench? And then, needless to say,
along came a farmer with a tractor, severed the trench, and caused a global outage. So again, it was a lesson learned: when you're doing data center networking, you want to see separate trenches for the cables and separate power feeds from different power stations into your data centers.
Francis Gorman (06:25.782)
Severing cables is quite the problem at the moment when we think of the undersea cables and what's happening there, but that's a whole other topic. Tim, you've argued that attacks always get better and they never get worse. So how should security architects be thinking when they design for resilience, and, I suppose, design against the constant upward trend of attacker capability?
Tim D Williams (06:52.643)
Yeah, that's a great question. And I think it comes back to, you know, having a good understanding of security economics. I know the late Ross Anderson is considered to be the father of the discipline of security economics. I think to be a good security architect, you have to have that sense of economy about whether you're asking an organization to spend too much of its money now, and they need to have some of that money in reserve for...
things that are going to come along which they can't foresee at the moment. So you're always having to think: how can I do the most economical thing for now, and make sure the organization is putting aside some reserves for the attacks that we don't know about now, which will inevitably come? We have to always be very disciplined and try to get maximum security efficacy for minimum security input. And I think if we're thinking about
the economics of what we're doing, then that will make us effective. It's the same as with a building architect. You'd want a building architect to have a good sense of the cost of construction and to be able to supervise the construction, signing off on the stage payments and making sure things are being done properly. You'd expect a good security architect to have a good sense of the economics of system and software construction, to be able to sign off on stage payments, and to keep reserves for the future.
Francis Gorman (08:18.334)
Makes total sense. I've been following with interest your engagements with OWASP of late, in terms of the changes they're making to where cryptography sits in the Top 10. Can you talk to me a little bit about what's going on there and what your driver was?
Tim D Williams (08:29.805)
Mm.
Tim D Williams (08:34.883)
Well, that was quite coincidental. I was due to deliver a course, actually for the BBC Academy, a couple of weeks ago. In preparation for the course, I thought I'd better check whether the latest OWASP Top 10, the 2025 edition, had come out. They had the release candidate out, open for two weeks of review; I think it was between the 6th of November and the 20th of November. And on the 19th of November, just before the closing date for comments,
I saw that they had proposed to reduce cryptographic failures from A2 in OWASP 2021 to A4 in OWASP 2025. So they're proposing to demote cryptographic failures. Now, I know that the OWASP methodology is very much based on looking at global reports of CVEs, Common Vulnerabilities and Exposures, and then aggregating those to form a view,
and they are obviously getting historic data inputs suggesting that cryptographic failures have reduced in importance over the last four years. But I also feel that, as a responsible organisation, they have to think about their methodology: to what extent they are looking back and to what extent they are looking forward. Because if they publish an OWASP Top 10 2025 final that says cryptography has gone down in significance, that could send a very wrong signal,
because the world actually needs to spend a lot of focus on cryptography over the next five to ten years. The final hasn't come out yet, and I'm very interested to see which way it goes. If they listen and they increase, or at least maintain, cryptography at two, that will be a result; two or one I'd be happy with, and not really happy with any reduction from two. If they listen, the feedback process has worked. If they do
reduce it, I think that over the next four or five years OWASP will increasingly suffer reputational harm as a source of trustworthy information about vulnerabilities. Because just as OWASP is saying reduce, the rest of the world — you know, NIST, NCSC, Europol's Quantum Safe Financial Forum, FS-ISAC, so many interesting bodies — are saying that
Tim D Williams (11:00.343)
This is a once in a generation time when we have to pay more attention to cryptography. And I think there's a risk that OWASP could be out of line with what's generally accepted to be the priority.
Francis Gorman (11:14.447)
Tim, you have a new startup, ProteQC, just to make sure I'm pronouncing that correctly. Can you tell me a little bit about what your vision is and what services you're going to offer to the market?
Tim D Williams (11:20.461)
That's right.
Tim D Williams (11:27.427)
Thank you.
Tim D Williams (11:35.566)
Thank you, Francis. Yeah, we pronounce it just like the word protect; we don't pronounce the C, even though people see the logo with the P and the Q and the C. Essentially it came out of a multidisciplinary collaboration between myself and some colleagues who were studying an online MBA with Quantic
School of Business and Technology, an online provider that aims to deliver high-quality postgraduate education at very affordable prices. So on the team we had a lawyer, we had a marketing specialist, we had a lady, BJ Miller, who's an Olympic gold swimming champion, and we had another colleague in Canada, and myself.
We brainstormed the problem of post-quantum cryptography from our different perspectives. And we did research into the growth in the quantum market and the sub-markets within the quantum field. What we observed is that almost all of the organizations playing into this pain space — the problem of cryptography that's worked well for 25 years without anyone having to pay for it now needing to be replaced,
an economic externality that people aren't budgeting for and now have to start paying for — were trying to sell products, not services; they're focusing on products. And when we applied our MBA-style critical analysis, we thought about the ideal strategy for any business: to have what you'd call a blue ocean strategy, to play into the space that other organisations aren't playing into.
We realized that going pure services and going vendor-neutral was going to differentiate us from any of the other incumbent players. So we're saying, look, yes, we know that there will be a need for products, but what most organizations are going to need is expert, experienced service providers to help them make sense of what they need to do, to help them interpret all of the...
Tim D Williams (14:01.079)
that's coming from outside, the regulations coming from outside, and to select the best products and integrate them. So we've decided to be a pure-play professional services firm. We realize that services can't scale the way products can, and we're not expecting to become a huge company, but we really want to be a very trusted provider of services, particularly to the financial services industry.
Francis Gorman (14:25.561)
So, pulling on that string: ProteQC gets set up and you get your first customers in. Can you talk to me about the approach you would apply to those customers to help them start their journey, or some tips and tricks that you believe are key and may not be considered in the mainstream? I think we know what the standard pieces are, but what's the approach? How do people get themselves set up, and set up for success, really?
Tim D Williams (14:54.627)
It's great that you asked that. That setup idea is very much what we're thinking: you also need a pre-discovery service that helps organizations prepare themselves for discovery. A lot of vendors are out there selling tools and saying, just put a tool in, do lots of network discovery, do lots of source code discovery, and then you'll find some cryptographic assets, you'll have a cryptographic bill of materials, and you'll be able to do something with that. But actually,
if you take a complex financial services organization that's had all the different generations of cryptography for probably 20 or 30 years, a lot of what needs to be done is pre-discovery, to understand what is discoverable. You're not going to be able to put an IP-based network tool onto, say, an IBM SNA network or an APPC network;
there will be architectures that are unreachable with an IP-based discovery tool, and you need to understand that. You also need to understand what constraints you might face if you're trying to deploy a tool: what are the firewall rules going to be and what are the technical constraints, but also what are the organizational constraints? If people are asking whether there's a policy, a standard, or a guideline that says you're allowed to do this, and you haven't got one that says it's okay to deploy a discovery tool,
you're probably not going to get very far. So we would very much emphasize the need for preparatory setup activities to prepare an organization for discovery. We'd call that pre-discovery.
Francis Gorman (16:36.058)
Perfect Tim, and I suppose a lot of organizations are looking at their business strategy now, their technology strategy, their cyber strategy. Is there a need for a cryptography strategy?
Tim D Williams (16:47.307)
I'd say absolutely, because cryptography is, in general terms, the only technology that can reliably protect data when it's not in your hands. Almost all other technologies that are not cryptographically based involve physical custody: if you have unencrypted data, you need physical custody of the data in order to know whether it's protected or not.
Physical custody protects your integrity, protects your confidentiality, but cryptography allows you to put well-encrypted data into the hands of others, knowing that even if they do their worst, they won't be able to break that cryptography. So cryptography is the only technology that allows you to control data that isn't under your direct physical custody. And when you think about what
all organizations want to do — they want to distribute data, to put data into different places, into cloud service providers, to transfer it to other organizations — cryptography is the mechanism by which an organization can control access to that data when it's outside their physical custody. So I'd absolutely say all organizations need to have a cryptography strategy, and it needs to be sustainable.
It needs to be a strategy that will work for the required data retention periods. Very often organizations don't think long enough about their data protection needs. In the banking industry, particularly the European banking industry, you're required to retain customer records for five years after the end of a customer relationship. Now, if you have a secured loan, that could be a 25-year mortgage. It could even be longer than that:
in Europe there are sometimes 40-year mortgages, and then five years on top. So you're talking about retention periods that are significantly longer than the period for which you can get a contract from a credible service provider to look after the data, and longer than several technology replacement life cycles. So you need a cryptographic strategy that can survive technology replacement cycles. People really need to think strategically about cryptography.
Francis Gorman (19:11.44)
And thinking about that a little bit more, what worries you most about post-quantum readiness in today's world? There's a lot of distraction from artificial intelligence, and there's a lot of regulatory oversight, especially in the financial industries with DORA and NIS2 and all of these other aspects. Are there a lot of distractors that may lead to...
Tim D Williams (19:22.339)
Mm.
Francis Gorman (19:37.223)
procrastination in terms of the ability to act and what kind of, if I was to ask you what keeps you up at night in this area, what would that be?
Tim D Williams (19:46.414)
You've touched on several things there. I think the thing I really worry about is organisations not devoting the resources they have accurately enough. If you have a pain point, there is a tendency to establish a budget, to establish some compelling need to do something, but then for that budget to be dissipated by doing the wrong things. And you know, if you get
non-specialists, and people claiming that their technology is some secret sauce that will solve it, and then all the budget goes on that technology, the organisation won't have the reserves it needs to do what it does need to do. So I think it's very important that the resources devoted to remediating legacy cryptographic technologies are correctly allocated.
And that precision in the allocation of resources is really what I worry about.
Francis Gorman (20:52.24)
And really, if we look at that a bit deeper, it's about aligning your tech refresh programs so that they include quantum-resilient hardware. It's about understanding your architecture roadmaps so you're not adding to the problem, and all of those different things. That's really key, and that's great insight, Tim. Thanks very much for that. You touched on budgets there a minute ago. I think one thing that everyone has struggled to do is to...
Tim D Williams (21:06.147)
Yep.
Francis Gorman (21:20.888)
quantify what the cost of quantum readiness is going to be for an organization. If you were to take a wood-from-the-trees view, are there any pointers you could give the listeners around how they might be able to figure this out for themselves?
Tim D Williams (21:33.484)
I think it's going to be difficult, because you're not going to have a lot of historic data. I mean, even with the best will in the world, the normal way you develop future budgets is by looking at past budgets, and most organisations just won't have enough data about what cryptography they've acquired in the past, how much they spent on it, and how much it costs them to maintain it.
That lack of data is what makes it difficult. But in general terms, whenever you're facing uncertainty, you want to use a mixed-methods approach to the financial estimation. There are several different techniques you can use, some of which are more top-down, some of which are more bottom-up. If you use several different techniques and then take the average, or a
weighted average, of the different techniques, you're more likely to converge on a realistic number. So, just talking about some of those numbers: if you take the number of customers you have in an organization, you can draw on per-customer estimates from organizations that have published effectively the equivalent. If you look at the US government and the UK government, they've put out estimates, over ten years, that
the cost per customer, or per citizen in the case of governments, is of the order of 10 to 20 dollars per citizen. That's for governments at scale; an organisation that doesn't have a population in the tens or hundreds of millions may not achieve the same economies of scale, so the per-customer figure may be higher, but you've still got a baseline. So maybe you take your number of customers and say, well, it's going to be a hundred dollars or a hundred euros per customer, and you've got
a simple top-down basis of estimate based on comparison. If you look at the number of systems you've got, you can come up with estimates based on the number of person-hours it's going to take per system. There are some metrics, for example from Carnegie Mellon University, on the estimated number of labour hours required to do post-quantum cryptography updates, depending on the
Tim D Williams (23:53.029)
low, medium, or high complexity of the system. So that gives another basis of estimate. You could also look at constraints: what is your infrastructure capacity to handle test environments, and what is your staff capacity to handle post-quantum migrations? Then look at what the incremental cost of increasing your infrastructure capacity or your staff capacity to perform post-quantum
cryptography updates over a period of time would be. So those are three different bases of estimate. It's fair to say there are others as well, and I'd be really happy to answer follow-up questions about them.
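The three bases of estimate Tim describes can be sketched as a small back-of-envelope model. This is a minimal illustration only: every number below (customer counts, system counts, hours, rates) is an invented placeholder, not a figure from the government or Carnegie Mellon sources he mentions.

```python
# Mixed-methods budget estimation for a post-quantum migration, as discussed:
# build several independent bases of estimate, then blend them so no single
# method dominates. All inputs are hypothetical placeholders.

def per_customer_estimate(customers, cost_per_customer):
    """Top-down: analogy with published per-citizen cost estimates."""
    return customers * cost_per_customer

def per_system_estimate(system_counts, hours_per_system, hourly_rate):
    """Bottom-up: labour hours per system, split by migration complexity."""
    return sum(count * hours_per_system[tier] * hourly_rate
               for tier, count in system_counts.items())

def capacity_estimate(extra_staff, annual_cost_per_head, years):
    """Constraint-based: incremental staff/test-environment capacity over time."""
    return extra_staff * annual_cost_per_head * years

estimates = {
    "top_down": per_customer_estimate(2_000_000, 10.0),   # 2M customers @ $10
    "bottom_up": per_system_estimate(
        {"low": 300, "medium": 120, "high": 30},          # systems per tier
        {"low": 40, "medium": 160, "high": 640},          # labour hours per tier
        120.0),                                           # blended hourly rate
    "capacity": capacity_estimate(10, 150_000, 5),        # 10 extra heads, 5 years
}

# A weighted average would let you favour the basis you trust most;
# a simple mean already damps any single outlying method.
blended = sum(estimates.values()) / len(estimates)
print({k: f"{v:,.0f}" for k, v in estimates.items()}, f"blended {blended:,.0f}")
```

The point of the exercise is not any individual number but the spread between the methods: a large gap between top-down and bottom-up figures is itself a signal that the inputs need refining before a budget is committed.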
Francis Gorman (24:38.65)
So what we're basically saying is no matter what way you go with this, once you get into size, the costs will increase and it's gonna be substantial. So enterprises need to start thinking about that in terms of their budgets and how they execute over the next couple of years. And I think you hit on something earlier on as well, which was, you know.
Tim D Williams (24:47.193)
Hmm.
Tim D Williams (24:52.408)
Yeah.
Francis Gorman (24:58.928)
professional services and having the right people capability to execute these changes. We're in a world that talks a lot about autonomous systems and automation and all of that sort of thing, but you really do need expertise at a human level to understand system architecture, et cetera. Some of these systems, depending on the age of your organization, are going to be old systems. They're going to be legacy systems. They're not going to be the flashy new cloud-based
Tim D Williams (25:09.518)
Hmm.
Tim D Williams (25:21.636)
That's right.
Francis Gorman (25:28.448)
environments that some people are used to. So it's gonna need expertise, it's gonna need careful planning, and it's gonna need a level of precision so that you're not double-spending. So that's great. I suppose if listeners want to learn more, they can hit up your website, and we'll stick the link in the details.
Tim D Williams (25:35.47)
Hmm. Hmm.
Tim D Williams (25:45.41)
Yeah, thank you, Francis. Actually, just picking up on AI: if you think about AI as a technology, obviously there have been huge developments, in large language model AI in particular, over the last couple of years. I'm regularly using AI agents to augment my human capability, in what you might call a centaur arrangement,
where the human is made stronger by using the technology, rather than the reverse centaur, which Cory Doctorow talks about: if you're asking humans to do something they're not very good at, that human-machine combination is weaker. He gives the example of expecting a car to drive autonomously but then expecting the human to spot when the AI isn't working and step in; that's a reverse-centaur scenario. But if you had a situation where
the car was monitoring the human, because the human might forget to look in the mirror or might forget to indicate before changing lanes, then the AI is strengthening the human. That would be a centaur situation. So I'm very often using AI, but I really want to point out the fundamental things that AI can't do. The main things we need to think about: AI is not embodied, and it lacks our senses. AI deals with relatively low-fidelity visual data, 2D, not 3D, not like our vision. Mono audio, not 3D audio like we can have. It doesn't have a sense of proprioception, doesn't have touch, doesn't have taste, doesn't have smell. And most of all, it doesn't have the multi-sensory integration that we have as humans. It also doesn't have embodied memory. Think about what a human being who's lived through several generations of technology can remember
about computers: we can remember our very first interactions with a computer. We can remember, you know, BBC Micros and Atari STs and Amigas and Research Machines 380Zs and VAXes. We have this lived memory of previous generations of computers, how they worked, and what their limitations were. Now, that lived memory is not available to
Tim D Williams (28:10.604)
large language model AI. A large language model AI doesn't have a lived memory of this is how this generation of computers transition to this generation. It has loads of text, but it can't make sense of the reality. A human can make sense of that and you really need experienced humans to make sense of the complex reality.
Francis Gorman (28:32.897)
I think I might say, just from the sprinkling of guests I've had over the year, it can't do those things yet. And I say yet for two reasons. One is we know large language models have a limitation, and Yann LeCun, Meta's chief AI scientist, is leaving for just that reason: he's going to build the next wave that is not reliant on language, because language
Tim D Williams (28:42.02)
Thank you.
Francis Gorman (29:01.955)
is not knowledge. I look at what Elon's doing at Tesla: he's building these bots with up to 50 actuators in their hands. He's got Neuralink. I saw this week that they ran the first computer system almost purely on human brain cells, grown in some lab in Silicon Valley. So it can't do it yet, but it could happen. And I think if we think back,
TIm D Williams (29:07.044)
Neuralink, yeah.
TIm D Williams (29:24.428)
It could happen, it could happen.
Francis Gorman (29:29.809)
to 2023, most people had never heard of ChatGPT, and now it's become the fundamental brainstorming and meme-generation tool of the world. You know, where we look at it now, I think it has its uses, it has its limitations, but there is exponential investment across the board in AI at the moment. So I don't think any of us knows where we're going to go. When I saw Tesla and the dancing robots the other day, I was pretty sure, you know,
there was a guy in a suit, till I looked a bit closer and went, no, they're definitely robotic legs that are moving. It is terrifying. But I suppose off the back of that, when we think about artificial intelligence and we think about security, it has completely changed the attack surface, depending on the types of AI. So we know large language models are grand, but we know people are probably dumping lots of sensitive data in there that they shouldn't, outside the boundaries of your organization.
TIm D Williams (30:25.432)
Hmph.
Francis Gorman (30:25.711)
If you're doing that, please stop. But agentic AI is breaching the very rules that we wrote for cybersecurity: zero trust, you know, least privilege. And now we're developing systems across enterprises that are autonomous, with no human interaction, and that have deep levels of permission and privilege across the organization's assets. When you look at architecture for AI, what are the things that we should be considering?
TIm D Williams (30:54.532)
It's a really great question. I think at the heart of your question is this idea of agency, and agents need to be identifiable. So you shouldn't have agents that are unknown, that are outside governance. You know, for an agent to be present in an organization, accessing its systems, accessing data, there needs to be
governance of those agents. And I know that it can be good practice in organizations to formally set up agents as if they were humans: to have records of them, to have them have line managers, and for them to be accountable. And I think there's a growing need to ensure that any
computer agent is accountable to humans. And that's an essential part of updating corporate policies, to the extent that we're not gonna have unauthorized agents, we're not gonna have unknown agents or unidentified agents in this organization, we're only gonna have authorized agents under our control. And that's necessary because otherwise those agents could be acting as proxies, as Trojan horses. You wouldn't want to have a
Trojan horse inside your organisation acting on behalf of some nefarious threat actor, so you have to know who is in control of the agents and who is behind them, and you have to have governance of agents.
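The governance model described here, registering every agent like a member of staff with a named, accountable human line manager and least-privilege grants, could be sketched roughly as follows. This is a hypothetical illustration: the class names, fields, and permission strings are invented for the example, not drawn from any real framework.

```python
from dataclasses import dataclass, field

@dataclass
class Agent:
    """An AI agent registered like a member of staff."""
    agent_id: str
    purpose: str
    line_manager: str                       # the accountable human
    permissions: set = field(default_factory=set)  # least-privilege grants

class AgentRegistry:
    """Only registered, human-accountable agents are ever authorized."""

    def __init__(self):
        self._agents = {}

    def register(self, agent: Agent):
        # No agent without an accountable human line manager.
        if not agent.line_manager:
            raise ValueError("every agent needs an accountable human")
        self._agents[agent.agent_id] = agent

    def is_authorized(self, agent_id: str, permission: str) -> bool:
        # Unknown agents are rejected outright; known agents are
        # checked against their explicit permission grants.
        agent = self._agents.get(agent_id)
        return agent is not None and permission in agent.permissions

registry = AgentRegistry()
registry.register(Agent("inv-bot-01", "invoice triage",
                        line_manager="f.gorman",
                        permissions={"read:invoices"}))

print(registry.is_authorized("inv-bot-01", "read:invoices"))   # True
print(registry.is_authorized("rogue-agent", "read:invoices"))  # False
```

The key design point is the default-deny stance: an agent absent from the registry, or present without a grant, simply gets nothing, which mirrors the "no unknown, unidentified agents" policy described above.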
Francis Gorman (32:33.073)
We've got competing factors here with, I suppose, the inception of DORA, which is very heavily focused on resilience, and then you've got this drive of AI: get it in, you know, we need to AI everything even if it's not the right thing to do. An organization is gonna face this funny dilemma where one half of the brain of the organization is just trying to AI everything and the other half of the organization is trying to
ensure resilience and clarity and, as you said, have accountability over the autonomy of these things. Are we at an inflection point where the way we think about security and the way we think about resilience is being challenged at its very core?
TIm D Williams (33:17.604)
I think you're right that there are deep challenges, but I don't think that the fundamentals have totally changed. I mean, think back to the heritage of what we now call cybersecurity: it grew out of what was originally called information security, which was the original discipline. It was recognized in the information security era that information could either flow
person to person, under organizational controls, or it could flow through IT systems, under technical controls. That was kind of information security. The cybersecurity era came about when we started to connect information systems up to things that have kinetic effects, or effects on motion, or effects on the flow of money. And so cybersecurity is an evolution from information security. But the fundamentals of
information flow control are still there. The fundamentals of understanding the required properties of the systems, the need for confidentiality, integrity, accountability, those principles are all still there. I think we just need to think even more critically about how we're architecting our organizations with these more complex agents in them,
in line with the principles that we've already established. I think there could be a schizophrenia within a particular organisation if the people working on AI are not coordinating with the people working on resilience; that could be a problem. But then that becomes an organisational governance issue: making sure that there's coordination
between the different parts of the organisation to bring them together.
Francis Gorman (35:22.482)
That's a really great answer, Tim. Thanks very much for that. It's one to mull over, and I think it's going to be really interesting to see how that actually materializes over the next couple of years as the technologies continue to change at a ferocious pace and cybersecurity races to keep up, which is always the fun part of the game. I want to ask you a little bit about education. You have a passion for education, both consuming knowledge and
then taking your time to give that knowledge back. And it's always fascinated me, your capacity to learn and to recall. Where does that appetite for knowledge come from? And you almost feel like you're committed then to redistribute that knowledge once you've learned it yourself.
TIm D Williams (36:12.448)
Yeah, thanks for asking that question. It's really quite a personal question; it's something to do with my self-identity. It's a combination of things. I mean, my parents moved around when I was a child, which forced me to move between different schools, and that somewhat disrupted my experience in my childhood. And it meant I had to learn about different schools and different styles of
teaching and education. I remember going from a prep school where we were being crammed with, you know, French and Latin aged, you know, five and six, into a middle school where they said, well, we don't think you should have lined paper because it stifles childhood creativity, just try and write on these blank sheets of paper without any lines. So I experienced, I suppose, several different teaching styles, and then I
came into a grammar school a year late and had to catch up. And that was quite competitive. It was actually a really good school, Ripon Grammar School. There were some quite illustrious people who were there at the same time. Richard Hammond, the presenter from Top Gear, was a couple of years behind me. We called him Titch. And Katharine Viner, who's editor
of the Guardian. So there were some interesting people at that school, but it was a very competitive place. They would put your marks up in the classroom so you could see what your ranks were, and in subjects like the sciences, physics, chemistry, you'd be looking for your rank in the year. Very competitive. And so I had that kind of encultured into me: constantly needing to learn and not resting on your laurels.
I kind of slacked a bit in my first degree because, again, I was just interested in learning, wasn't really interested in the academic outcomes. It was just, oh, there's another subject, there's another subject. And I kind of took it. And at that point, I wasn't sure what I was going to be doing. I was studying psychology and pharmacology and doing various experiments, giving hallucinogenic drugs to various
TIm D Williams (38:38.402)
animal species. You know, what does a spider do with LSD? What does a fighting fish do with LSD? So we did all kinds of interesting things. But then we also did neural network studies: what are the neurons of the planarian flatworm, and can we model those? And we were using Lisp and Prolog and early AI languages. So through a combination of random life events, I ended up with a really
odd set of skills: psychology combined with coding, combined with some statistics and Fortran programming and maths. So yeah, I ended up with a mix of different skills which I then found ways of using in the workplace and eventually in cybersecurity.
Francis Gorman (39:34.419)
No, it really is fascinating, Tim. And I think each one of those experiences is a layer of complexity that you can delve into, and it gives a different perspective. And I think that's why cryptography has been one of the things you've really excelled at: because it is a multidisciplinary, complex ecosystem that underpins the modern world as we know it today. If I was to ask you, for
somebody who's starting out their cybersecurity career, who's looking at quantum and thinks there may be an opportunity there over the next several years to build up and become a cryptographer: are there any low-cost or entry-level courses or reading material across the internet they should be looking at, as a kind of a starter for ten?
TIm D Williams (40:22.5)
Absolutely. So specifically on cryptography, I would recommend Stanford University's Cryptography I course, and we can make sure that link goes into the podcast notes. That's delivered by Professor Dan Boneh at Stanford. And the reason why I particularly highlight Stanford: well, it's actually available as a Coursera MOOC, I think, so you can get a certificate for not very much money. But that course
takes you back to the university where Diffie and Hellman and Rivest, Shamir and Adleman published the algorithms we all know now as the basis of asymmetric cryptography, around 1974-75. They were actually a few years after the actual invention of asymmetric cryptography by Clifford Cocks at GCHQ. Of course, he couldn't publish his papers in 1969-70,
and it only came out later that asymmetric cryptography had already been discovered at GCHQ. But what's really interesting is the speculation that there might have been conversations between people at GCHQ and Stanford that led to Stanford making inferences. So it's possible that there was some sort of out-of-band, out-of-channel communication that led Stanford there. But Stanford really has that
pedigree in its heritage. So if you learn from the Stanford course, you're really learning from the place that brought cryptography to the world. There are still lots of other great cryptographers and great cryptography faculties. I studied my MSc in Information Security at Royal Holloway. Royal Holloway has some great cryptographers. It was the first faculty of Information Security anywhere in the world, founded by Professor Fred Piper in 1979 when there weren't any other security faculties, and so
Royal Holloway is still really recognised. If you look on the ISC2 website in the USA, it's one of the only British universities that appears in the drop-down list. Royal Holloway is really good. Edinburgh Napier is excellent, with Professor Bill Buchanan. ETH Zurich, excellent for cryptography. But I would say Stanford is a good place to start.
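The asymmetric cryptography discussed here, where anyone can encrypt with a public key but only the private-key holder can decrypt, can be illustrated with a toy RSA example. The primes below are tiny and chosen purely for illustration; real RSA uses keys of thousands of bits plus padding schemes, so this sketch shows the mathematics only, never an actual implementation.

```python
# Toy RSA key generation with deliberately tiny primes.
p, q = 61, 53
n = p * q                  # public modulus (3233)
phi = (p - 1) * (q - 1)    # Euler's totient of n
e = 17                     # public exponent, coprime with phi
d = pow(e, -1, phi)        # private exponent: modular inverse of e
                           # (three-argument pow needs Python 3.8+)

msg = 42                       # message must be < n
cipher = pow(msg, e, n)        # encrypt with the public key (e, n)
plain = pow(cipher, d, n)      # decrypt with the private key (d, n)

assert plain == msg            # round trip recovers the message
print(f"n={n}, e={e}, d={d}, cipher={cipher}, plain={plain}")
```

The asymmetry the speakers describe is visible in the exponents: (e, n) can be published freely, while recovering d from them requires factoring n, which is easy for these toy primes but infeasible at real key sizes (until, as discussed earlier in the conversation, large-scale quantum computers arrive).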
Francis Gorman (42:41.804)
Excellent, and thanks very much. So we'll definitely put up that course. I think it's on Coursera; it's about a hundred quid or so to sign up, so that's value for money and gets you a good start on the knowledge you need. Tim, look, it's been an absolute pleasure to have you on. Best of luck with the startup. I'm sure your knowledge and wisdom will be sought out by many enterprises as they start to grapple with what to do next.
TIm D Williams (43:03.172)
Thank you so much.
Francis Gorman (43:11.441)
So I really wish you all the best with that venture.
TIm D Williams (43:14.926)
Thank you, Francis.
Francis Gorman (43:16.888)
That's it, folks. That's a wrap on season one. It's been quite the journey from when I started this a number of months ago. Thank you for all of the support, lovely feedback, and comments, and I look forward to the next year of the Entropy Podcast. But for now, I want to wish everyone a happy Christmas and a happy new year, and I hope you have time to put the feet up and relax. We'll see you in January.
Francis Gorman (01:46.862)
So that's it, guys, that's a wrap. That's the end of season one. It's been an absolute roller coaster of a year. We've charted in Belgium, in Ireland, in the UK, and in the United States of America. It's phenomenal to look back on all of the wonderful guests I've had, and I really do appreciate everyone who gave their time to the show. But for now, I hope you have a lovely Christmas and a happy new year, and I'll see you for season two in 2026. Take care.