SNIA Experts on Data

Data Protection: Insights on Privacy, Security, and Regulations

SNIA Episode 19

The complex world of data protection encompasses key elements, including data integrity, availability, and privacy. Hear our SNIA experts discuss what it takes to keep data secure, protected, and private as they explore regulations like GDPR, the distinction between data security and data privacy, and the vital importance of proper data management throughout the data lifecycle. You'll hear: 

• Definition of data protection and its multiple elements 
• The impact of GDPR on data handling practices 
• Differences between privacy and security explained 
• Challenges presented by new storage technologies 
• Importance of lifecycle management in data protection 
• Emphasis on collaboration among tech, legal, and regulatory bodies

About SNIA:

SNIA is an industry organization that develops global standards and delivers vendor-neutral education on technologies related to data. In these interviews, SNIA experts on data cover a wide range of topics on both established and emerging technologies.

Speaker 1:

All right, welcome everybody to the SNIA Experts on Data podcast. I'm super excited because we've got an amazing panel of experts and fantastic humans joining me today for a great discussion around the practice areas in SNIA. We've covered a lot of neat focus areas, and today we're going to talk about Protect, and you're probably going to say, what does that mean? That's why I've got these amazing folks with me. Oh, and by the way, my name is Eric Wright. I'm the Chief Content Officer and co-founder of GTM Delta, and also the lucky host of the SNIA Experts on Data podcast. With that, I'm going to take us around the room and introduce the incredible panel. We'll start with Michael. Do you want to give a quick intro of yourself? And then we'll get rolling through the crowd.

Speaker 2:

Thank you, Eric. My name is Michael Dexter. I'm an independent support provider in Portland, Oregon. I'm very active in open source circles, and I host three calls a week: one on containers, one on OpenZFS and one on hypervisors.

Speaker 1:

Well, I'm glad we could sneak in between those; thank you very much for joining. Eric Hibbard, not just because you're amazing and you have a great first name, but there's definitely a lot more that I'm a real big fan of with you, Eric.

Speaker 3:

Thank you, Eric. So in SNIA I actually chair the Security Technical Work Group. My day job is with Samsung Semiconductor, where I deal with NAND and SSDs, and I'm very active in the security and privacy standardization activities: IEEE, ISO, things of that nature. [inaudible] And mostly, I'm a senior manager with Ernst & Young.

Speaker 4:

I've been in the industry for more than 35 years, all around data protection, data retention and data archival. I have held a lot of roles in disaster recovery, business continuity and ransomware protection across my career, and I'm happy to be a member of this discussion panel.

Speaker 1:

Fantastic. And, John, I can say that you love Kioxia so much that it's your middle name; I love your tag that I get to see on here. So, John, if you want to do a quick introduction, then we're going to jump into the chat.

Speaker 5:

Sure. So hi, I'm John Geldman. I'm employed by Kioxia, where I get to lead their standards activity in general. It's fair to say I am a standards geek. I am also on the SNIA board and active in most committees you can name, including a lot that Eric runs.

Speaker 1:

Now the fun part is that we get to stay on you for a second, John, because one of the things we have with the practice areas is really understanding what they mean. You know, the data focus areas are curious to folks from the outside, so let's start fresh: no one knows who we are. Let's talk SNIA and the Protect data focus area.

Speaker 5:

All right. So today I'm here to introduce this talk on one of SNIA's focus areas: to secure and protect data. I'd like to call out two of the active groups that SNIA has focusing on this area: the Data Protection and Privacy Committee, which creates marketing and educational materials, and the Storage Security Technical Work Group, which generates standards, provides comments on various domestic and international standards, provides support to other SNIA technical work groups, and works with many of SNIA's alliance partners. By the way, I also co-chair that group. Now, the SNIA online dictionary, if you haven't seen it, is a great tool.

Speaker 5:

It describes data protection as the combination of data integrity, data availability and confidentiality. I'm going to throw in data privacy as the fourth element. That's: ensuring that data has not been altered; ensuring that data is available even if there are physical failures; ensuring that data is only accessible by authorized parties; and, what does it actually mean for a person to have control over their personal data? Each of these elements is complex, with multiple aspects and approaches. Each of these four elements is affected by the ongoing development of media, new data organizational frameworks, and also by the growing capabilities of quantum computing, which affects our cryptography. Each of those elements is supported by a combination of cryptographic systems, permission systems, data redundancies and regulations.
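[Editor's note] The first of those elements, integrity ("ensuring that data has not been altered"), is commonly implemented with cryptographic digests. A minimal Python sketch, with made-up record contents:

```python
import hashlib

def fingerprint(data: bytes) -> str:
    # A cryptographic digest: any alteration of the data changes the digest.
    return hashlib.sha256(data).hexdigest()

record = b"customer-id=42; plan=premium"   # hypothetical record
stored_digest = fingerprint(record)        # saved alongside the data

# On read-back, recompute and compare to detect corruption or tampering.
assert fingerprint(record) == stored_digest
assert fingerprint(record + b"tampered") != stored_digest
```

Availability and confidentiality are handled by separate mechanisms, redundancy and access control, which is why the dictionary lists the elements as distinct.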

Speaker 5:

Now, as for systems and developers: we use all of that stuff, except, usually, the regulations. For the lawyers and legislators, that's different. There are now laws on the books that have consequences if you aren't doing the right things in these areas, if your company is responsible for data breaches or doesn't appropriately provide control over personal data. Now that I've got that scary thought going, I'll turn this over.

Speaker 1:

Nothing like opening on a bright note. And it's funny that you did highlight something that I think we're probably going to spend a good amount of time digging into across the whole panel: the difference between the technical capabilities, the implications of those as an operational challenge, and then the third pillar being the regulatory challenge that comes along with it, which is, you know, regional as well. So you talked about some of the fun that starts as lawmakers enter the conversation. Maybe let's kind of start as we look at defining data protection. You were looking at how the US focuses on it versus...

Speaker 1:

You know, there's not a person in this audience who likely hasn't heard the phrase, you know, the letters: GDPR. It does go all the way down to the metal, because there are capabilities that we're handling from the very, very lowest possible viewpoint that will traverse up to the point where they are bound by regulations and excitement. So I'm going to open this up and see who gets to the mic fastest on this one, but let's talk about, you know, that sort of market definition and why it's regionally different.

Speaker 3:

So I can probably take that one, since I do quite a bit of work in the privacy space. Yeah, you're right, it does get all the way down to the metal, so to speak. And we can use the GDPR, since you invoked the well-known acronym, because something as, quote, simple as destruction of data could constitute a data breach. And that was one of the big eye-opening things for a lot of folks, especially in the storage industry, because I think up until that point we always thought of a data breach as unauthorized access, but that, in fact, is not the case. Corruption and destruction can get you in just as much hot water as unauthorized access. That said, GDPR was a sort of second move by the European Union to try and get this right. They've since been followed by a bunch of different jurisdictions, so places like China, for example, have very, very strict laws in this space.

Speaker 3:

In the US, we're a bit schizophrenic when it comes to privacy regulations. At this point, I believe every state has something on the books, and they vary significantly. California, where I think John and I are at, has some of the more stringent requirements, but, you know, Massachusetts has got its own sort of version. None of them are tracking, like, GDPR. At the moment the focus is a little bit different, and we don't get too worried about things like IP addresses, where the Europeans are very fussy. Even photographs and whatnot are protected information in certain settings, so we haven't crossed over into that space. It's unclear that we will see any serious federal privacy regulation in the US. If I had to speculate, it would probably take a path that would preempt some of the more stringent laws that are on the books in some of the states.

Speaker 1:

So yeah, I think the chaos factor we've got in the US is going to continue. I can say, coming from... I'm Canadian by birth, as my text and accent always give away, but I worked for, let's just say, a large insurance company that rhymes with "fun life." Anybody who goes back to my LinkedIn will easily tell where I was from. But I was really exposed early to the idea that it's not just, you know, as a company you have requirements; it was as a company per location, because there were provincial regulations around the types of data we could hold and maintain. And then it became country to country, of course, because it's a global company, and I did get that early exposure to it. And, like you said, I think, Eric, you brought it up: the idea that we think of it as exfiltration. The removal and selling or publication of data is what people see as the sacrosanct, the ultimate thing.

Speaker 1:

But in fact, just possessing and holding the data still has an incredible amount of regulatory weight attached to it. And when we think of the right to be forgotten, I think that was one of the really interesting parts of legislation that came out of the EU, and it introduces a whole exciting challenge. So maybe on that: what does it mean, as a standards development body, when we see something like that come from the regulatory side? How do we then parlay that down through the rest of our organizations and how we handle it?

Speaker 3:

Well, I mean, as somebody who's heavily involved in, like, ISO standards: we don't cross over into the regulatory space. In fact, this is something that we have to be extremely careful about, because, in the case of some of the work in the ISO Subcommittee 27, which deals with security and privacy, well, it's a little more expanded than that, we have on the order of 75 national bodies represented. So one size would not fit all from a regulatory perspective, and the standards tend to dodge that issue. The twist is that it's not uncommon for regulations to actually cite certain standards. So there is a relationship, but in general the standards try to avoid getting into it.

Speaker 5:

I'll note that I would tend to agree with Eric. We don't really touch them in the standards. But I know, for what I do, I get called for advice from legal about export rules in different places.

Speaker 1:

We need to put our product specifications, not our standards, out there, and that's actually part of what we need to control carefully, where we're actually exporting trade secret information. Remember the days when even just running the wrong version of Netscape could actually get you in trouble? Remember there was a 40-bit and a 128-bit version? It's kind of wild that we take for granted how easy it is to just grab software and put it down on our systems these days. And then, when it comes to data protection in general, again, we sort of take it for granted that when we hear "data protection" it means backup, but that's actually, like, one sliver of it. So maybe on that: what is the difference between security, privacy and the storage elements of this Protect data focus area?

Speaker 4:

Maybe I can step in here. Go for it, Muneer. Yeah, the idea here is, when you talk about data protection in general, our thought within the infrastructure resources is that it's protection from data loss, data corruption and data unavailability, by providing, as you just said, Eric, backup as an example, maybe replication, maybe cloning, maybe snapshotting. All of this is to protect the physical availability of the data. Protecting the data from prying eyes, that is data privacy. So we always tend to mix them. As Eric Hibbard just said, the line separating those two items is very blurred, so it's very important for the end user to understand what they are protecting from. For example, if your data is on a local device that is not connected to anything, it is unlikely to be accessible to prying eyes, so it is protected from the privacy point of view, but that never protects you from hardware failure or data loss. So these are the barriers, or the separations, between the data privacy concept and the data protection concept.

Speaker 3:

I might put a slightly different spin on that, because, you know, privacy, as a practicing privacy professional, there's a legal regime, a regulatory regime to it. So I would say what Muneer described is probably more from a confidentiality perspective. To use an example: the fact that you might have certain data on that laptop might actually be a privacy violation, but it wouldn't be a confidentiality issue, because it's not connected. And understanding that, I mean, that may sound like a subtle difference, but it's not: you may not be authorized to have anything to do with certain data that's sitting on your laptop.

Speaker 3:

So privacy has a dependency on some of the security aspects, and, as you said earlier: availability, integrity, confidentiality. I said it in the wrong order; I should have used the CIA wording. But security itself doesn't actually care about privacy from a regulatory perspective, and that's one of the interesting challenges. Coming back to Muneer's comment about data protection through kind of the lens of storage: from a security perspective there's kind of an assumption that you're doing the storage side of things correctly, because there are concerns about things like business continuity. So privacy has some dependencies on the security community, and the security community has some dependencies on the storage-oriented aspects of data protection. But it's subtle, in many cases, how strong that dependency actually is.

Speaker 1:

Yeah, that's an interesting thing you brought up, Eric: this idea that it's almost like kinetic and potential, as we talk about in the sense of energy. The data you have on your machine has the potential to be something. It's not necessarily actively at a point where it's at risk of exploit, but just the fact that the data is possessed on that drive could introduce a regulatory issue. And that's why it's such a careful dance. I thought I just had to call my storage team up; I didn't realize that I had to call legal every time I need to ask one of these questions. But that is this interesting boundary, and, John, I'm going to come to you for the hot take. There's got to be something. How do you explain this when you're talking to your community, and what's the angle from which you look at this security, privacy and protection?

Speaker 5:

So one of the key aspects of privacy is that it really is about somebody's personal information and control of it, and so if it doesn't have to do with that person, then it's not a privacy concern for that person. It's the sharing of that information, and whether it's still there or not, or whether it is allowed to be shared in other places, which is important. When we talk about confidentiality in general, it's a more general thing: are only the people who are supposed to have access to that data having access to it? So if the company has decided that it's okay to have a database of names and phone numbers and everybody has access to it, then we need to make sure that only the people who are supposed to have that have that.
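[Editor's note] John's phone-directory example reduces to an authorization check: confidentiality holds when reads are limited to exactly the set of people the organization decided should have access. A toy sketch; the users and dataset names are invented for illustration:

```python
# Who is *supposed* to have access to each dataset (hypothetical example data).
ACCESS = {
    "phone_directory": {"alice", "bob", "carol"},  # company decided: broadly shared
    "payroll": {"alice"},                          # restricted
}

def can_read(user: str, dataset: str) -> bool:
    # Confidentiality check: only listed users may read the dataset.
    return user in ACCESS.get(dataset, set())

assert can_read("bob", "phone_directory")          # authorized
assert not can_read("bob", "payroll")              # not authorized
assert not can_read("mallory", "phone_directory")  # unknown user
```

Privacy is the separate question of whether the organization is allowed to hold and share those names and numbers at all, which no access-control list by itself answers.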

Speaker 3:

Except, and that's a good example, John, in that companies were suddenly forced to deal with that to comply with GDPR. And in many cases, people would split up their call centers so that they would be in certain regions, or they would say, yeah, we can't keep this. Oh look, we've got 25 years' worth of records.

Speaker 3:

Why do we have this? There was a lot of soul-searching that kind of went on. But yeah, I think more generically, Eric, to a comment that you kind of made: think of healthcare information stored away, squirreled away somewhere. If a regulation goes into force, now everybody's got to scramble to figure out, what do we actually have, you know?

Speaker 1:

what do we get?

Speaker 3:

Stored, because now, suddenly, you know, it's regulated and you've got to basically deal with it. This happens all the time in the regulatory space. As John pointed out, personal information is a little easier to track because you know what to chase, but we're seeing regulations pop up. You know, California a while back had an IoT-oriented regulation where, all of a sudden, it was, am I an IoT company? And it depended on the definition, which was basically everything but routers.

Speaker 3:

So we get hit with these things where we've got to sort of skulk through the data to figure out what we have and how we have to protect it.

Speaker 1:

And it's such a tough challenge too, as technologists, because we live in this intermediary space where we have to understand the regulatory impact, but then we also have to relate it to the technical capabilities. And that was a great example where, yeah, unless it was bolted into a 42U rack, it was considered to be mobile, and that introduces an interesting challenge. Now I'm going to switch to the question I brought up to you, Michael, before, because I want to go bottom-up. As you look at describing them, what are the capabilities that we're standardizing that attend to some of these challenges being brought up at this regulatory, top-level layer?

Speaker 2:

I'm glad you mentioned that, because you used a phrase like "I need to talk to legal every time I do something." Well, while we have countless storage products out there, the average user, be it small and medium organizations and even rather large ones, does not have storage engineers. And I'm glad SNIA has the Storage Developer Conference, because there are very few true storage developers. So, at best, we are handing smaller organizations weapons that are unsafe at any speed, and I'd hope that organizations like SNIA can help bring them these notions when they are setting up storage. And this was an eye-opener for me, thank you, Eric Hibbard: that the destruction of data is just as important as the protection of data, because your average storage engineer will hold on to that data until the end, and that's their job, and they are failing at their job if you need to destroy it. So, bottom-up, I think we need tools that are available to all.

Speaker 2:

I personally believe those lie in open source tools, because the vendors are inconsistent in their delivery of data protection solutions, and it's a challenge. There's no question it's a challenge, be it those competing interests of protecting yet carefully deleting, or, as you've mentioned, the sudden HIPAA or GDPR requirements. Untangling a rock-solid archive of data can be a nightmare. Insofar as we've succeeded as storage engineers in providing multi-tiered, decades-ready storage, suddenly someone, one user out of a thousand, has the right to be forgotten. How do we take that solid, don't-touch-it store and extract that one user? So the challenges are palpable. Every effort is appreciated, and I certainly hope SNIA can play a role in all that. I hope that answers your question to some degree.

Speaker 1:

That's great, yeah. And I think, actually, we're going to go to the deepest possible complex thing that has now come to the fore, which is, you know, vector databases being a way in which we store data that's not easy to remove. And this is the problem we've got: people are training models, we're storing models that contain a vast amount of information, and we struggle with explainability. And the reason is that it goes down to this idea of the protection, destruction and lifecycle of that data: how do we make sure that we can safely, and even potentially, remove some of that stuff? Now, we're not solving the vector database problem; I pulled that as one example. But what are the artifacts that we're dealing with, at the metal and up, that are being leveraged as we see new tech come in that's going to challenge the methodology?

Speaker 2:

I'll throw in one key point there that I always come back to in our meetings with the DPPC. It's that when there's something new, be it a vector database or something exciting, generally there's a question: well, is it faster, more reliable, more secure or cheaper? And so it's still a storage technology; let's find where it fits in that mix. But yeah, there's always something new and shiny, and there always will be. So that's my take on it.

Speaker 3:

Yeah, and part of that, sort of following up on his summary there: from the security piece, we typically ask, what happens when you're all done with it? You could actually have the security problem solved while it's operational, in use, but if you haven't thought through what it means when you're all done, then you may have actually exposed yourself to a lot of data breach scenarios at a time when, if you're getting rid of something or you're shutting operations down, you surely don't want to be dealing with something like this.

Speaker 3:

So you know, this is a facet that is very important, especially in the storage arena, because, you know, we are the guardians of the data and, as I think everybody is aware, we go to great lengths to squirrel it away in lots of places to ensure it never disappears. But when it's time to make it disappear, we've got to basically deal with that problem.

Speaker 1:

I had the strangest, funniest thing happen years ago. I remember that we had, like, you know, workstation-style servers. This is back in the early 90s; I'm an older fella, so I've been around since some of the early implementations. And we had a bunch of spare servers, so when they would come into production, we would take their drives away and move them. So, a shelf of servers, a shelf of storage, all these disks, you know, gigantic 256-megabyte drives, massive, you know.

Speaker 1:

But I took drives. I'm like, oh, there's a spare server, and I just took five disks, popped them into my nice little tower and powered it up, and it took about three or four minutes and it came up to a Windows screen. I was like, oh my God. I literally picked a stack of drives that still had controller memory and disk-side memory, and it recovered the array; I just happened to have picked five that were in order. So that was a scary moment where I was like, okay, what just happened here? I never installed Windows NT; I don't know how this works.

Speaker 4:

Actually, this problem is now very common across the Internet, the hyperscalers and the cloud providers, where a company or an individual acquires storage, let's say for a project, and puts some test data or dev data on it to do their development, qualification and certification of the project. And then, once they're done, they release the resource back to the cloud provider, to stop billing, without removing the data. The next person who picks up that storage is going to have a lot of fun going through PII data that they can use for multiple purposes. So that's a concern that has always existed, to be sure.

Speaker 5:

Now, personally, I'm a fan of the Sanitize command, but I helped invent it, so there you go. That's a command which is used in hardware devices to really reset everything that's there. One thing that I'll share, coming from NVMe, though I won't give too much status about it, is the ability to purge a namespace, which will be very helpful for logical storage in the future. But that's not available yet; it's coming.

Speaker 5:

It's a lot easier to deal with these logical constructs than it is with the physical constructs.

Speaker 3:

And John said something actually that's worth pouncing on here.

Speaker 3:

He used the term "purge," and that, in the security world, hopefully has some special meaning to you, because that's essentially a technique that is supposed to hold up to, potentially, a nation state trying to do a recovery. It should definitely be anti-forensic. So, I mean, if the purge techniques have been implemented correctly, forensics people should be very unhappy if they're trying to recover data; there should be no way of getting at the data itself. Now, forensic people are very creative, and systems are notorious for temp space and, again, squirreling data away in a bunch of different locations, and so they may be able to recover it that way. But if you just hand them a drive on which a purge operation has been executed, there should be nothing available to them, or a nation state, if done right.

Speaker 1:

Yeah, and it's to the point where, you know, on the physical side, we used to just literally degauss drives, which was always fun, watching them do little dances for us on these exciting magnets and heat up. But as we move to the cloud, to distributed, shared storage environments, we don't have the degaussing option, and that real understanding of lifecycle goes all the way to the point of termination of the data, and we just easily forget. We even talk about backup products; they're not backup products, they're restore products. But we always talk about it in the context of, you first have to back it up. Time and time again, it's: can it be restored? Whether it's going to be sitting in cold but live storage, whether it's going to be sitting in hot storage in a secondary site, there are so many ways in which we can manage how that data lives and, ultimately, where it dies.

Speaker 3:

Yeah, and, you know, another angle that we're having to deal with, related to the standardization, is when you consider sustainability. Going out and destroying drives that are potentially reusable is really becoming less acceptable. I don't think we've quite crossed that line yet, but there are a lot of people looking at that situation: how do we eradicate the data on the drive but leave the drive in a state where it's usable? And this is back to that term that John used earlier, purge, and in particular using cryptographic erase, which is essentially using encryption and certain key management techniques, as in losing the key intentionally. There are some other conditions that you have to worry about. So not only are we having to worry about eradicating the data, but now we're also having to look at doing it in a way where we're not causing physical destruction to the underlying storage device or media. And it's clear to the storage industry that we've got to deal with this.
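[Editor's note] The cryptographic-erase idea can be sketched in a few lines: data is only ever stored encrypted, and "purge" means destroying the key, leaving the media intact and reusable. This sketch uses a toy SHA-256 counter keystream purely for illustration; real self-encrypting drives use vetted ciphers such as AES-XTS, and the class and method names here are invented:

```python
import hashlib
import secrets

def keystream_xor(key: bytes, data: bytes) -> bytes:
    # Toy SHA-256 counter-mode keystream, for illustration only; XOR makes
    # the same function both encrypt and decrypt.
    out = bytearray()
    for offset in range(0, len(data), 32):
        block = hashlib.sha256(key + offset.to_bytes(8, "big")).digest()
        out.extend(b ^ k for b, k in zip(data[offset:offset + 32], block))
    return bytes(out)

class CryptoErasableStore:
    """Data is always stored encrypted; 'purge' = destroying the key."""

    def __init__(self) -> None:
        self._key = secrets.token_bytes(32)  # media encryption key
        self._cells = b""                    # what physically sits on the media

    def write(self, data: bytes) -> None:
        self._cells = keystream_xor(self._key, data)

    def read(self) -> bytes:
        if self._key is None:
            raise PermissionError("key destroyed: data cryptographically erased")
        return keystream_xor(self._key, self._cells)

    def crypto_erase(self) -> None:
        self._key = None  # ciphertext remains on the media but is now useless

store = CryptoErasableStore()
store.write(b"personally identifiable data")
assert store.read() == b"personally identifiable data"
store.crypto_erase()
# store.read() would now raise PermissionError
```

The ciphertext still physically occupies the media after the erase, but without the key it is computationally unrecoverable, which is what lets the drive be reused rather than shredded.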

Speaker 3:

SNIA has been operating in this space for quite some time; the Green Storage TWG has worked with the EPA on, you know, a lot of technologies. And we see the Open Compute Project has got an entire sustainability activity. So this is something that the information and communications technology industry is basically trying to figure out how to step up and deal with.

Speaker 1:

Yeah, and that is where we often feel like there are competing interests, because, you know, the best way to destroy it would be, you know, burn it with fire, throw it into a volcano, but obviously the impact of doing so is terrifyingly bad for so many other reasons. So we have to think about recycling, but destruction at the data layer, with absolute belief and proof that this data is irrecoverable.

Speaker 4:

Yes.

Speaker 5:

One of the fun things.

Speaker 2:

Go ahead, John. You go, you haven't gone yet.

Speaker 2:

There is indeed a secondhand market, and in this broader education of, okay, how do we properly sanitize, a consumer sure appreciates knowing the number of hours on a drive, the SMART status, the health of that drive and whatever they can about its history. And there are vendors who are somewhat famous for fudging firmware and somehow having terabytes of data through a drive but very few hours, which is completely implausible. And so this whole ecosystem is indeed a challenge, and that hasn't quite been solved to our broader satisfaction. Go ahead.

Speaker 5:

So, two interesting things. One is, destruction isn't as easy as it used to be. When we can put multiple Encyclopedia Britannicas on an eighth of my pinky nail, then all of a sudden, if you're trying to shred this, you just can't; you can't get it down to something which is meaningless to a state organization. You have to effectively phase-change it. You have to make it go from solid to liquid or something else; you have to melt it; you have to do something to really change it. Now, crypto erase is actually one of the easier ways to get things done, if your hardware supports it. And when we're talking about where we're holding data, you know, if we're actually holding data in a database or in an FPGA, we've got a different story on our hands, but we still have these potential confidentiality and privacy concerns to worry about.

Speaker 1:

This is, I think, such a perfect conversation, and I only wish we had, like, another hour, because there's so much we could dive into. But, thematically, I want to wrap by just saying the most important reason for what we're doing in SNIA, and collecting amazing folks like everybody I've got here, is the fact that we are going to come out of here, as we should out of every architecture discussion, with a series of questions, not answers. Obviously we drive toward answers, but the best thing you can have is all the questions being asked. And I would say that this is probably the number one reason I recommend people get involved with SNIA: because you're surrounded by other technologists who are experiencing the same problems in parallel with you, and we're all solving them together. So the pace of innovation is so much better in this broader tech community. And inevitably, you know, let's not reinvent the wheel, and let's certainly not destroy the wheel in a way that doesn't protect us from PII being on the wheel. There are so many things that we could do, but I want to say thank you to all the amazing folks.

Speaker 1:

This has been a fantastic conversation. Like I said, there are a thousand more questions I want to ask; I want to have so much fun with everybody. But you've all been great. So, for a quick round: what's the best way that folks can reach you? We're going to start with you, Eric Hibbard. What's the best way if people want to get caught up with you?

Speaker 3:

Other than, of course, directly meeting up at SDC and SNIA events?

Speaker 1:

Sure. Well, yeah, I'm on email, erichibbard at samsungcom, or I'm pretty easy to track down on LinkedIn. Muneer, again, I recommend people check it out: Muneer and I had a fantastic conversation in the past, so do check out some of the other amazing podcasts as well, with other folks that have been on here. So, Muneer, how do we get ahold of you if we want to catch up after the fact?

Speaker 4:

I think the fastest way to reach me, and potentially everybody on this call, is LinkedIn. My profile is open for anybody who wants to get information or connect, and I'll be happy to respond to any inquiry that follows.

Speaker 1:

Excellent and Michael.

Speaker 2:

So, by email, editor at callfortestingorg, and on the Fediverse, dexter at bsdnetwork. Nice.

Speaker 1:

And John, last but very certainly not least. Thank you again for everything you're doing around keeping the wheels on the bus, the hamsters spinning inside the wheels, and all the amazing stuff you're doing with SNIA, keeping all of us having these great conversations. What's the best way we can get ahold of you if we want to reach you?

Speaker 5:

First, I'll have to admit that, whatever it is, my boss would like to know. But I can be reached at johngeldman at kioxiacom, and I also have a LinkedIn page where I can be reached.

Speaker 1:

Fantastic. And there's a great white paper coming up on data protection; as soon as it becomes available, we'll be able to share that around. There's tons of great content that goes along with these conversations. So folks, do check it out, follow the podcast, smash that like button, subscribe, and do all those things that the Zoomers tell us we're supposed to do. But, more than anything, connect with these amazing folks. Thank you all for sharing the time with me today.