Journals of the Information Entrepreneur - Jacqueline Stockwell
Welcome to "The Journals of the Information Entrepreneur"! Hosted by Jacqueline Stockwell, CEO and Founder of Leadership Through Data, this podcast is dedicated to empowering and inspiring information leaders across the globe. Jacqueline shares her expertise in revolutionizing information management training and delivering it in a way that captures the audience's attention and ensures their time is well spent. In each episode, Jacqueline engages with industry experts and thought leaders to discuss the latest trends, challenges, and best practices in information management.
051: The Data Protection Dungeon Master: Ralph O’Brien on Serious Privacy and the Quest for Trust
🛡️ The Data Protection Dungeon Master: Ralph O'Brien on Serious Privacy
Are you tired of being part of the "Department of No"? 🛑 It’s time to stop blocking projects and start building them.
In the latest episode of the Journals of the Information Entrepreneur, we sit down with Ralph O'Brien, a veteran with 25 years of experience in data governance. Ralph explains why privacy professionals must move away from being seen as "blockers" and instead become "builders" who help businesses find solutions.
In this episode, we explore:
- Personal Branding: How to align your career mission with your values to gain credibility.
- Privacy by Design: Moving past the theory to real-world action using eight key concepts like "Minimize" and "Inform".
- Leading the C-Suite: How to use positive language to show leadership that privacy actually helps business functionality.
- The AI Challenge: Why we need critical thinking and human oversight instead of fear-driven rules when using AI.
Stop being a roadblock and start being an enabler. 🚀
🎧 Listen to the full episode now! (Links below)
#PrivacyByDesign #DataProtection #InformationEntrepreneur #Compliance #Leadership
Hello, and welcome to today's show. I'm Jacqueline Stockwell, CEO and founder at Leadership Through Data, and I inspire and motivate information leaders across the world. I'm super excited to have Ralph O'Brien with me today. He's got 25 years of experience, and he's transitioned from the front lines of global security to helping shape the very policies and standards that govern our digital world. He is a director of the Institute of Operational Privacy by Design, a renowned trainer for Leadership Through Data (so we've been working together for some time now), and a co-host of the award-winning Serious Privacy podcast. In an exciting new chapter, he has recently rebranded his consultancy business to align fully with the Serious Privacy family, bringing his decades of experience and strategic expertise under one powerhouse banner. And I love it. He is known to many as the Data Protection Dungeon Master. He doesn't see compliance as a mere checkbox; he sees it as a quest to prevent data harms and build practical, trustworthy governance. Whether he's advising multinational organisations on AI or auditing global frameworks, his goal is always to be a positive-sum advisor who protects individuals while enabling businesses to thrive. So welcome to today's show, Ralph, and congratulations, first of all. Let's talk about your rebrand. How does aligning your consultancy work directly with the Serious Privacy brand change the quest for you and your clients?
SPEAKER_01: Well, I'm not sure the overall quest has changed. The quest for me is always the same, and has been for many, many years now: to get all the benefits and excellent stuff we can out of technology and data use, but without the hazards and harms. What happened really was it occurred to me that I'd set up my business, Rainbow Consulting, but people kept coming to me as Ralph O'Brien, and Rainbow Consulting was just a handy wrapper, really. It was more my name than anything else. But when I started working with Kay and Paul on the Serious Privacy podcast, I basically got aligned with that. So we drew up a brand agreement. Kay registered it in the US, Paul registered it in the Netherlands, I registered it in the UK, and that gives us the ability to provide global support and align all our brands together. And we're going to have some other exciting branded things on the way in the future too, which I can't say too much about yet. But yeah, check it out.
SPEAKER_00: I love that. And I just want to touch on a few points there, Ralph, because I'm a big believer in our own personal brand. As information leaders, so I call them that, but they're privacy officers and also records managers, we should be very good at our own personal branding. Our personal branding is what we believe, how we show up, how we present. And what I really like about the way you described Rainbow and Ralph O'Brien is that you'd actually built your own personal brand under Ralph O'Brien, and Rainbow was the wrapper underneath it. So I really like that you've rebranded into something associated with your podcast. I think it's sensational, and I think it's a really good thing for us to have our own personal brand alongside our businesses, our consultancy firms, or even the organisations we work with. Now, I want to talk about your mindset. You are famous as the Data Protection Dungeon Master. So how does that RPG-inspired approach help you guide boards through the monsters and the traps of the global regulations that we adhere to?
SPEAKER_01: Well, first of all, you're making a big assumption that Dungeon Master refers to role-playing games.
SPEAKER_00: I know you too well!
SPEAKER_01: Seriously, though, it's really interesting. I've been doing a lot of work recently on making things more accessible, on gamification and threat modelling. And actually, I'm a bit of a geek; most people would know this about me in my real life. If I want to relax, I paint little plastic miniatures and get my friends together and roll dice. And there are loads of transferable skills there: an enjoyable experience, collaborative storytelling, imaginative creative thinking, ways to navigate new situations and hazards and harms, and having fun doing it, right? So yeah, you talk about a brand; actually bringing in some of my personal life, that gamification, that strategic thinking, and the ability to get together and work at a problem through collaborative storytelling is really close to my heart.
SPEAKER_00: Yeah, and that's why we work so well together under Leadership Through Data, because we have the same values, vision, and mission when it comes to things like that. So I think it's super amazing that you've got the same. Yeah, cheers, Ralph. That's my neurodivergent brain coming in, not thinking of the right word. So let's talk about Leadership Through Data. You have mentored generations; we've been working together for about nine years now, I think it's been. What is the biggest soft skill privacy professionals are missing when they're trying to influence the C-suite?
SPEAKER_01: Personally, I think we need to get out of our own way. We are so passionate about words like privacy and data protection that we almost turn ourselves into activists, and we turn people off if we start saying no. I've got a big presentation that I normally do about being a builder rather than a blocker, being a designer, having solutions, not problems. And I think we become more valued in the room if we're seen as a positive influence, something that adds functionality, adds features that enable trusted uses of data, rather than being the department of no. Brakes are important, don't get me wrong, but if you want to build a fast car, you need to build good brakes, and that's a feature you're adding. You're adding functionality rather than removing it. In fact, even though I've just rebranded my business as Serious Privacy, and privacy is an American concept, I do prefer the term data protection, because privacy can be a zero-sum game where you're saying no, but data protection law doesn't really do that. It's about saying yes, process data, but in a way that doesn't hurt people. So you're adding controls, you're adding technical and organisational measures. It's just a more interesting way of thinking about it than tick-box compliance.
SPEAKER_00: Yeah, agreed. And there are some really valid points there, because we all know our stuff. We all know the law and how to apply it, but it's actually about how you get that leadership driven through the business, isn't it? Through organisations and through ourselves, how we show up, and how our behaviours come across in organisations, whether we're the department of no or the "come help me build relationships and let's do some really cool stuff together" department. So, as a director of the Institute, how do we move privacy by design from a legal theory into practical operational reality for developers and engineers?
SPEAKER_01: Well, first of all, it was just a compliment to be asked. I was actually heading their standards committee for a while, out of the US, based in Florida with Jason Cronk, and they asked me to come on as a director and really shape their strategy. In terms of operationalising: a lot of people say they do data protection by design, but don't. Quite often in data protection divisions, because we're seen as that blocker, we're thrown things late with "well, make it legal" or "do a DPIA on it", and we don't actually get the opportunity to be in at the design phase, the early phase. I'm a big fan of the eight words from the little blue book. Now, this is something probably not a lot of people have heard of, but the little blue book was designed for software engineering, and I actually run a presentation and an exercise on one of the Leadership Through Data courses about designing a new product. On the data side, those eight words are: aggregate, you can hide people in groups; hide, like encryption and pseudonymisation; separate, separating out different data sets and different user groups; and minimise the data you collect. Then on the individual, the user side: inform them with good transparency; enforce, get rid of bad actors; demonstrate you're doing it well, with good badges; and offer people meaningful control. So actually, I often find when I'm doing things like DPIAs, it's not about pointing out the risks sometimes, but providing people with the action plan, the meaningful outcomes. And if you can use those eight words to come up with design strategies and design suggestions, and get in early with the coders and the developers and the business units and the IT departments, it's much, much better than being handed something later on in the journey, where you're left with a software product or technology that, even if you do make suggestions, is too late to alter.
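For listeners who like to see ideas as working artefacts: the eight words above lend themselves naturally to a checklist that turns a DPIA finding into an action plan. A minimal sketch in Python; the strategy names follow the episode, but the mapped example actions are purely illustrative, not taken from the little blue book itself.

```python
# Hypothetical sketch: the eight design strategies as a DPIA action-plan
# checklist. The example actions are illustrative assumptions, not
# prescriptions from the little blue book.
STRATEGIES = {
    # Data-oriented strategies
    "minimise":    "collect only the fields the feature actually needs",
    "hide":        "encrypt at rest; pseudonymise identifiers",
    "separate":    "keep different data sets and user groups in distinct stores",
    "aggregate":   "report on cohorts, not individuals (hide people in groups)",
    # Process-/user-oriented strategies
    "inform":      "publish a plain-language notice at the point of collection",
    "control":     "offer meaningful opt-outs and preference settings",
    "enforce":     "define and act on a policy for bad actors",
    "demonstrate": "log design decisions; surface badges and certifications",
}

def action_plan(applicable):
    """Turn the strategies a DPIA flags into concrete action items."""
    return [f"{name}: {STRATEGIES[name]}"
            for name in applicable if name in STRATEGIES]

# A DPIA that flags three strategies yields three actions, not a shelf document.
for item in action_plan(["minimise", "inform", "control"]):
    print(item)
```

The point of the sketch mirrors Ralph's: the output is a list of things someone must go and do, not the document itself.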
SPEAKER_00: So let's talk about AI. I love talking about AI. Everyone wants to talk about AI, everyone talks about AI, but I need to talk about it with you. So, with your background in ISO 27001, how are you helping organisations navigate the black box of AI governance without falling into the trap of fear-driven compliance?
SPEAKER_01: Well, ISO 27001 is part of it from the security aspect, a plan-do-check-act information security framework. But actually, I want to take a step back. Far more important is the capacity of humans to take the time to critically think. Now, you talk about how we present ourselves; one of the problems with critical thinking is that, to the business, it can look like a waste of time, a lack of productivity, because, you know, are we generating anything? And in the world of AI, the easy road is to let the computer take over, to let the computer answer our questions, to accept the outcomes of generative AI blindly. It gets us into bad habits, because we stop taking responsibility for the technology we unleash. The brain's like a muscle, right? You've got to exercise it. Stop using it and hand over that cognitive function, and it'll atrophy. Prince Harry said at the IAPP last week, actually, when discussing privacy on stage with Joe Jones, that power demands responsibility, and when you're processing at scale, where responsibility is absent, harm becomes inevitable. And so for me, in the world of AI, we humans are so worried about whether we can do something that we never stop to think whether we should, and the harms come as an unintended consequence. So, generally speaking, my process is always: can we think about this? And fear-driven compliance isn't a thing for me at all. In fact, we have very little to fear in the UK when it comes to regulatory penalty. So for me, it's more about what do you want to be, who do you want to be, how do you place your products, how are you going to engage the human and better serve the human? And that will bring you the business dividends right then and there.
SPEAKER_00: So, what is the one thing about human behaviour regarding data that has stayed exactly the same since you started in the field?
SPEAKER_01: Oh, I mean, humans are humans. The problem with humans is that we're cognitively adapted for living on the West African plains, right? But then we have these medieval institutions, and the technology has handed us the power of the gods. So not a lot has changed. People are people; we're still driven by the same id and ego and drives. In fact, when you look at the data protection principles, they haven't actually changed since the seventies. What's changed for me is the scale. When you used to make a mistake, the mistake was smaller. When I first started in my career with manual data, we could have 5,000 paper-based records in local government, and you might have a 5% error rate, and you'd go, oh my god, that sounds huge, a 5% error rate. But 5% of 5,000 is a lot less than what I was dealing with last week, which was 0.1% of 16 billion images. If you do the maths, 0.1% of 16 billion is a lot more than 5% of 5,000, right? The technology is morally neutral; the technology doesn't care what you do. The axe doesn't care if it's building a house or swinging at someone's head, right? So it's how we use it, how we implement it, how we govern it, how we think about it. But as things become easier, we've got the capability for huge, huge benefit and huge, huge harm. So humans haven't changed, their predilections are always the same, but the scale of the problem has changed.
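Ralph's scale argument is worth doing as arithmetic, because the numbers are counter-intuitive: the "worse" error rate produces far fewer affected records. A quick sketch using the figures from the episode:

```python
# The scale point from the episode, as arithmetic: a much lower error
# *rate* on a modern-scale data set still dwarfs a high error rate on
# a paper-era record set.
paper_records = 5_000                     # paper-based local government records
paper_errors = 0.05 * paper_records       # 5% error rate

modern_records = 16_000_000_000           # 16 billion images
modern_errors = 0.001 * modern_records    # 0.1% error rate

print(int(paper_errors))                  # 250
print(int(modern_errors))                 # 16000000
print(int(modern_errors / paper_errors))  # 64000
```

So a 0.1% error rate at modern scale touches 16 million records, sixty-four thousand times as many as the 5% rate did on paper.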
SPEAKER_00: Yeah, I saw something on LinkedIn yesterday about an experiment where mice interacted with each other, mated and reproduced, and were kept in a box, and then all of a sudden they decided they didn't want to do that any more, and the males isolated themselves. And it was all framed around how people interact on social media. I think the volume of data that we have right now is just tremendous, right? So, back to the video: the mice stopped socially interacting, they stopped doing things, and then they eventually died out. And the premise was that we as humans, in our behaviours, have gone straight to social media, absorbed, keep going, keep going through social media, and then all of a sudden there is this push to come back now, isn't there? So that's the kind of change of behaviour I've seen: the platform behaviour.
SPEAKER_01: Well, humans are tribalistic in nature; they always have been. When I was growing up, at school, the neighbouring town's high school were the enemy. But I lived in the country, and if people from the city turned up, they were the enemy. And if someone from a different county turned up, we would join together as a county, right? And if someone from a different country turned up, you joined together nationally. So we are nationalistic, xenophobic people. It's been a privilege in my life that I've managed to travel the globe and meet people from different backgrounds and realise they're not that different; we're all humans with human values. And the internet enables us to sit at home, and there is this kind of conversation like, why would I talk to my next-door neighbour when I can speak to someone across the other side of the globe who shares my interests? My next-door neighbour might not. But what that leads to is the fact that we don't expose ourselves to other points of view. We become tribalistic, we become xenophobic; the internet allows us to like people who are like us and block people who aren't. Actually, one of my best friends has got very much the opposite political views to me, but we argue respectfully, and I like to think we bring each other closer to the centre. So I actually think that once you get to know the opposition, if you like, it becomes harder to other them. I'm a big, big fan of face-to-face contact, a big, big fan of in-person, because once we get to know and meet people, once we can't just shut that door electronically, once we do expose ourselves to views we might find abhorrent, then actually we can probably achieve a greater level of understanding and consensus and empathy with our fellow humans, rather than just living in our own little echo chambers.
SPEAKER_00: Yeah, I agree with that. Amazing. So, you often speak about preventing data harms rather than just avoiding fines. What is the most significant hidden harm businesses aren't paying enough attention to right now?
SPEAKER_01: Well, fines are generally a terrible reason to do anything. You're unlikely to motivate someone with blackmail: do this or bad things will happen. Even if they do do it, they walk into it begrudgingly, with one arm tied behind their back, going, "I'm doing this, it's horrible, the law says I have to." So I've always been a carrot rather than a stick fan. The biggest harm, and I'm not even sure it's a hidden harm, is our own mindset as humans: running before we can walk. We rush to adoption. I am an early adopter of new tech, if only to test it out. I was in a business the other day where the management had set a goal that 60% of their processes were going to be on AI, and everyone was running about trying to use AI, which was a solution in search of a problem, right? No one had even done the research. Are your processes 60% capable of being automated? Is that an appropriate solution? "Well, our customers want this, right?" So we've always got this: let's get ahead, let's adopt, let's move forwards. People will adopt new and sexy technologies without the prior forethought that's needed to adopt them in a more responsible way, and unfortunately that means harms happen as a result, whether they were poorly intentioned or really well intended. You don't have to have intent to cause harm; you can be well-meaning and still have a bad outcome. And I think that's the biggest harm I'm seeing. The problems are bigger, the harms are bigger, and we move in wholesale without the proper forethought and research and governance. Now, I don't think that has to be solved with regulatory penalties, and I do believe that if technology is a problem, sometimes technology can be the solution.
But at the same time, you've probably hit the nail on the head. It's the people, it's the humans, it's our own cognitive weaknesses and the tendency to move forward with solutions without the proper thinking. I'm not even sure that's a hidden harm. I just think that's who we are as a species.
SPEAKER_00: Yeah, and you know, one of the things from my National Health Service days is that if I didn't have a good relationship with somebody, there was no way I was going to get anything across the line. It is about building those relationships, about showing that influence within an organisation. I'm also a strong believer that information leaders, privacy officers, data protection officers are intrapreneurs: they work in businesses, and an intrapreneur and an entrepreneur are essentially the same thing; an entrepreneur just has the financial risk attached, right? There are some really good discussions from Alice and Edgar around this topic. But we need to influence those organisations and actually say: we're leading privacy, we're leading records management within the business, and this is how we do it, this is how we show up, and this is how we influence. And it's fine to have fines as a backstop if we don't do things, but I don't think they should be the driver behind all the technology, the AI, the compliance; it's the people aspect that needs to run through each and every business.
SPEAKER_01: Putting the human at the heart of what we do.
SPEAKER_00: A hundred percent.
SPEAKER_01: Yeah, and I love the fact you keep using the word "influence". "Influencer" is a much-maligned word in the world of online influencers. But actually, we do have a personal brand, we do carry ourselves wherever we go, and we have choices about the sort of person we want to be, the responsibility we take, and the people around us we have an effect on. So we need to take responsibility for that and, hopefully, you're a parent, I'm a parent, produce people out there in the world who want to make a positive difference to the people that surround them.
SPEAKER_00: Exactly. It's the same even when you buy a product, right? You go into a store, you buy a product, and if a customer sales representative isn't very kind to you, you think, well, why would I part with my money? It's the same thing. Why would somebody get me to do extra work, on information asset registers, for example, if they're not forthcoming, if they're not explaining things, if they're not sitting down and taking the time with me and actually saying: if we do this, we can achieve great things together, and these are the benefits for you? That's all conversations. And I'm a strong believer that 40, maybe 60% of information leaders' jobs should be about relationships and communication, and how you can drive those things through the business.
SPEAKER_01: We're pretty good at knowledge in the data protection world, insanely good at knowledge. But what we're perhaps not good at is communication skills; what we perhaps need to be better at is the softer skills. I was looking at a DPIA the other day, and it was really interesting: they'd gone to all this trouble to do this almost hundred-page DPIA. And I said to someone, well, what's the outcome? And they said, what do you mean? We've done the DPIA. And I said, no, no, the DPIA is not the outcome. The DPIA is an artifact of evidence; what the DPIA should actually be doing is protecting people from harm. So what's the action plan? What's the outcome? Who's going to do something as a result of this document? And they said, well, I've done the DPIA. And I said, no, no, the DPIA is not the end, it's the means. So who are we talking to? Who's taking the next steps? Who now needs to do something beyond this DPIA? This is not something that should just be filed and put on the shelf in case someone wants to look at it. This should mean something and do something. And that means you have to go out there and make sure the people around you are working with you and for you, because otherwise you've just wasted everybody's time, including your own.
SPEAKER_00: A hundred percent. And time, I always say, is a golden nugget. Actually, most of us are doing jobs we shouldn't be doing; we're doing other people's jobs. So it's that time management thing as well: if we spent more time doing the stuff we should be doing, like influencing organisations and getting stuff over the line, we'd actually get what we need done without that burnout rate. So, last question from me, Ralph. If you could give one piece of advice to a CEO who currently feels trapped in a data protection dungeon, what would it be?
SPEAKER_01: Well, at the risk of being self-promoting: give me a call. No, seriously, available at reasonable rates, as they say. We're not bad guides. We've talked about the fact that we do need to promote ourselves better and perhaps communicate better. But if we're talking about the metaphor of your dungeon and your RPG, let's face it, all the good fantasy stories need a guide, a friendly wizard to put the heroes on the right path, right? Somebody who is going to be the source of wisdom, who can ultimately let them go on to be the heroes of their own story, but at the same time be that comforting presence on the sidelines, the element that can help put them on the path to their own adventure and their own glory, and help them maximise the benefits and minimise the harm. It can be win-win. It's not profit versus privacy; it's profit through good data protection practices at the end of the day. So the more we can engender trust, and the more we can get out there, make sure we critically think, and have good people around us who enable us to complete our mission rather than get in the way of it, the better off we'll be.
SPEAKER_00: Thank you so much for your time. It's been incredible. How can listeners reach out to you, Ralph, if they want to know more?
SPEAKER_01: Well, I think I mentioned the new website earlier, seriousprivacy.co.uk, or just Ralph at seriousprivacy.co.uk, or reach out on LinkedIn or Bluesky at IGRO'Brien. Listen to the Serious Privacy podcast. And of course, I'll be on Leadership Through Data courses in the future as well.
SPEAKER_00: Amazing. Cheers so much, Ralph. Thank you.
SPEAKER_01: Pleasure is all mine.
SPEAKER_00: Thank you for listening to the Journals of the Information Entrepreneur with me, Jacqueline Stockwell. I hope you found this episode inspiring and helpful, and that you have some takeaway tips that can be useful to you. If you liked this episode, please like, review, and share it with your friends. Your support helps us reach more information leaders so they can stay inspired and listen to great content. Want to test your strengths and weaknesses and measure them against our Empowered framework? Please complete the scorecard; it's a great way to evaluate and improve your skills. You can find the scorecard at the end of the description of this podcast. Stay tuned for a new podcast every Thursday, and remember to be bold, be brave, and be beautiful.