"The Data Diva" Talks Privacy Podcast

The Data Diva E268 - Anuj Jain and Debbie Reynolds

Season 6 Episode 268


In Episode 268 of The Data Diva Talks Privacy Podcast, Debbie Reynolds, The Data Diva, talks with Anuj Jain, Lead Privacy Engineer at S&P Global. They discuss the role of privacy engineering in shifting data stewardship earlier in the technology lifecycle and how privacy-focused design strengthens both compliance and innovation. Anuj explains why many organizations still view privacy narrowly through a legal or security lens and why technical privacy practices are essential for building sustainable, enterprise-wide maturity.

The conversation explores how privacy engineering transforms real operational workflows, including the review of cookies and tracking technologies, automation of assessments, governance of AI systems, and managing risk through proactive testing and technical controls. Anuj provides insight into how S&P Global structures its privacy program across legal, technology, and business teams, creating a model of cross-functional collaboration that allows privacy to scale.

Debbie and Anuj also discuss the cultural dimensions of privacy and how expectations differ across regions such as India, Europe, and the United States. They examine the impact of consumer awareness, regulatory timelines, and local norms on how individuals and companies interpret privacy. Anuj closes with practical insights about the power individuals have to influence corporate behavior through their choices and questions, and why thoughtful human decision-making remains essential even in an AI-driven world.


Become an insider: join Data Diva Confidential for data strategy and data privacy insights delivered to your inbox.


💡 Receive expert briefings, practical guidance, and exclusive resources designed for leaders shaping the future of data and AI.


👉 Join here:
http://bit.ly/3Jb8S5p

Debbie Reynolds Consulting, LLC



[00:00] Debbie Reynolds: The personal views expressed by our podcast guests are their own and are not legal advice or official statements by their organizations.

[00:12] Hello, my name is Debbie Reynolds. They call me the Data Diva. This is The Data Diva Talks Privacy Podcast, where we discuss data privacy issues with industry leaders around the world, with information that businesses need to know.

[00:26] Now I have a very special guest all the way from India,

[00:30] Anuj Jain.

[00:31] He is the Lead Privacy Engineer at S&P Global. Welcome.

[00:37] Anuj Jain: Hi, Debbie. Thank you. Thanks for having me.

[00:40] Debbie Reynolds: Well, I'm very excited to have you on the show.

[00:43] I actually did a speaking engagement with S&P Global a couple of years ago, so I really enjoyed collaborating with the organization. But I am a data person, so people who do privacy engineering on the data side interest me very much.

[01:00] So you and I connected on LinkedIn. I said, hey, why don't you come on the show? And you're like, sure.

[01:07] So happy to have you here.

[01:09] Anuj Jain: So happy to be here. So, you know, when you did that talk at S&P,

[01:13] I was there, listening to you on the other side.

[01:17] Debbie Reynolds: Yeah.

[01:18] Anuj Jain: And that was great.

[01:20] Debbie Reynolds: That's so cool. So cool. I didn't know that; I didn't know you were in the audience.

[01:27] Well, tell me about your journey. Privacy fascinates me because I feel like everyone has a very unique story about how they got into it, right? It's not like you went to a university and all of a sudden you're a privacy engineer.

[01:44] But tell me about your journey in privacy engineering at S&P Global.

[01:49] Anuj Jain: Right. So when I was doing my engineering degree at university,

[01:55] I was studying computer science and engineering.

[01:59] But by the end of my final year, I knew that I didn't want to be a developer.

[02:05] The job that I got, though, was a developer role.

[02:09] So there was a gap before I joined that job, and I had some free time.

[02:15] There was an internship offer at one of the Big Four firms, Deloitte, and that's where my journey in this privacy role started.

[02:22] I found it interesting enough to at least try, and if I didn't like it, I could go back to being a developer.

[02:30] So for that three-month period I had, I joined Deloitte in this privacy role, and I've really enjoyed the work ever since. It fascinated me. I think it was 2018, and GDPR had just kicked in.

[02:43] There was a lot of buzz; it was a new, growing field that many people didn't know about. And I really liked it, in terms of learning exactly what privacy is, and in terms of the future it held for my career as well.

[02:57] So I just kept going with Deloitte for three years. Then I moved to different roles.

[03:02] I started out in consulting, you know, with Deloitte and other firms.

[03:07] So I was helping companies, you know,

[03:09] become compliant with GDPR. But I was also fascinated by technology.

[03:14] So I had it in mind that I had to do something related to privacy and technology.

[03:18] And after a few years I moved to the implementation side of privacy, you know, for the industry.

[03:24] First I moved to an Indonesian company, where I helped them set up their cookie banners,

[03:30] privacy-enhancing technologies, and all that.

[03:32] Then after that, I finally came to S&P as a privacy engineer, where I'm leading that role, you know, doing many things with AI integrations, automating stuff, and all that.

[03:44] So yeah, that's my journey, and I came to privacy just because I didn't want to be a developer.

[03:53] Debbie Reynolds: Well, it was a good choice. It seems like it fits very well with your skills and also your interest in technology.

[04:02] I would love to know: what do companies get wrong, or what are some of their biggest misunderstandings about privacy and data?

[04:14] Anuj Jain: So if I talk about India, companies mix privacy up with security.

[04:18] Most companies think it's a subset of security, and also that it's a legal thing: legal is going to take care of privacy, and the other part is already covered by security.

[04:29] Right. So they think they don't need privacy specialists or a privacy team. That's the biggest thing I have seen with companies,

[04:37] at least in India: we have encryption and all that to prevent data breaches, we have the legal team to create privacy policies, and that's it.

[04:46] And mostly, companies think that they are the owner of the data once it gets into their systems, rather than the data subject or the user. So they think, we are the owner, and we don't need to do anything in terms of privacy.

[04:59] So I think that's the major thing, and that obviously affects the whole ecosystem.

[05:05] Right. If top management is thinking like that, the same message goes down through the whole company, and that becomes the mindset the company works with.

[05:16] So this affects the whole mindset as well.

[05:20] Debbie Reynolds: So first of all, I'm going to try to remember all the things that you said. You said some extremely key things. So one of the things you said is that some people think that privacy is a subset of cybersecurity.

[05:32] And so they think things like, oh, we have encryption, so we're protecting privacy. It's like, well, that's totally different, right? The distinction I make is that it's the difference between having a key to a house and a deed to a house.

[05:47] So for me, I think privacy is more about someone's rights, as opposed to how you secure that quote-unquote asset. And then there's people thinking about privacy as a legal thing, where I think you and I know that you have to think about it more proactively, in terms of how you handle data, as opposed to waiting until something bad happens and

[06:11] then legal jumps into it. But tell me about that, because privacy engineering, in my view, is more about shifting the idea or the strategy around data further left in the organization, in terms of how companies look at it.

[06:28] But what are your thoughts?

[06:30] Anuj Jain: So privacy engineering is mostly, like you said, moving data stewardship to the left.

[06:34] Right. So if I talk about S&P Global, it's a very heavily scrutinized company, with a lot of legislation over it.

[06:44] So it was very mature in terms of that even before I joined, in terms of privacy legislation and all that scrutiny. And we have different teams:

[06:54] there's a legal team that handles privacy, this privacy technology team, which I'm part of, and then privacy champions in different divisions. It's quite a mature company.

[07:03] And we all work together without interfering with each other. That's why we have such a mature privacy program.

[07:11] But if I look at other companies, from when I was in consulting as well,

[07:16] they generally don't even consider privacy a distinct role. For them, the technology part of privacy is taken care of by their developers.

[07:28] What they think is: we'll just train our developers here and there on what needs to be done at a low level, in terms of data quality, recording consent, and all of that.

[07:38] We'll build that into our websites and applications, have developers go through a training or something, and then we have our privacy engineers. That's how many companies think, in my view. But yeah, at S&P it's very different.

[07:53] And what I do personally, or what our team does, covers anything that the company builds:

[08:00] currently that means creating AI systems,

[08:04] asking people to use AI more to create automations every day,

[08:10] embedding websites with new tracking technologies, and all that. Everything that is done in terms of technology goes through our team.

[08:18] Legal has its own framework, policies, and procedures that teams need to follow, but we are the ones who actually test and assess things from the technology standpoint.

[08:29] So we use different tools, and we have created certain automations, but we also look at all the requests that come in as a PIA or other assessments. Once we have identified the risks, we try to mitigate them with the help of the system owners and see whether it's feasible to launch.

[08:46] Apart from that, what we also do is automate. We create API integrations with different systems so that we get the data automatically, and it's not a burden on different teams to fulfill our requests, because that's where the friction comes in.

[09:03] We used to go to teams every now and then asking for certain details: fill in this form, fill in this assessment,

[09:09] give us this data, and all that. That also creates friction; this is what we saw. That's why we are now building automations that integrate directly with internal systems, so we don't have to keep reaching out to them, plus a few things around AI as well, just to help us internally.

[09:27] So yeah, as a privacy engineer, this is what we do: assessing all the technologies in the company,

[09:35] and automating things so that, for legal, it's not all email threads and meetings, because privacy isn't always taken that seriously. And accordingly, we have put checks in place here and there as well.

[09:48] So that's what we have done. But like I said,

[09:52] at many companies, what I've seen is that it's mostly handled by developers; it's not even treated as a distinct role.
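To make the integration pattern Anuj describes concrete, here is a minimal sketch in Python of pulling system metadata from an internal inventory API to pre-fill a privacy impact assessment (PIA) record, so teams are not asked to re-enter details by hand. The endpoint, field names, and sensitive-category list are hypothetical illustrations, not S&P Global's actual systems:

```python
# Sketch: pre-fill a draft PIA from an internal asset-inventory API,
# so the request does not become a manual form for the owning team.
# All endpoint names and fields below are hypothetical.
import requests

INVENTORY_API = "https://inventory.internal.example.com/api/v1"  # hypothetical

def fetch_system_record(system_id: str, token: str) -> dict:
    """Fetch one system's metadata (owner, data categories, and so on)."""
    resp = requests.get(
        f"{INVENTORY_API}/systems/{system_id}",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

def prefill_pia(system: dict) -> dict:
    """Map inventory fields onto a draft PIA, flagging gaps for human review."""
    data_categories = system.get("data_categories", [])
    return {
        "system_name": system.get("name", "UNKNOWN"),
        "owner": system.get("owner_email", "UNKNOWN"),
        "processes_personal_data": bool(data_categories),
        "data_categories": data_categories,
        # Sensitive categories trigger a deeper, human-led assessment.
        "needs_full_review": any(
            c in {"health", "financial", "biometric"} for c in data_categories
        ),
    }

if __name__ == "__main__":
    record = fetch_system_record("sys-1234", token="redacted")
    print(prefill_pia(record))
```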

[09:59] Debbie Reynolds: Yeah, that's true. It's either not treated as a distinct role or not thought of at all. Or maybe they assume the cyber people own it, or the legal people own it.

[10:13] So I want to talk a little bit about silos.

[10:16] So I always say a lot of companies operate like Santa's workshop, where everyone has their own little part to play.

[10:25] But I think that when you're dealing in privacy, you have to see the big picture of how an organization is dealing with data. So I feel like people who are in privacy have to be very good communicators at different levels of the organization.

[10:41] So how do you get that level of like collaboration and coordination within the teams that you're working with in the company?

[10:50] Anuj Jain: Right. I mean, you're definitely right: privacy cannot be a silo if you want an effective privacy program. There has to be communication, and yes, privacy people have to be good communicators as well.

[11:02] But mainly, where I've seen privacy driven well, where privacy is very mature and taken seriously,

[11:05] it all happened through top management. You can't just start it from the bottom, because no matter how mature privacy is in a company, or how many trainings you've done,

[11:23] it is still treated as a legal thing: we are doing this because of legislation.

[11:29] So you need to have it from the top; it has to be top-down. And what we have done, like I said, is put controls in place.

[11:40] For example, we don't even allow a product to be launched

[11:43] Debbie Reynolds: Right.

[11:44] Anuj Jain: if it hasn't undergone a privacy assessment. It won't be launched; that's like a red flag for them. Basically, approval is required for you to launch.

[11:56] If you don't have that approval, nobody will allow you to move it to production. That's one part. Apart from that, we have the legal team working with us as well.

[12:05] So if by any means somebody is not taking it seriously, or doesn't want to do it, we work together with the legal team to have that person or that team talk to their heads, so that they do take it seriously and understand what repercussions could follow.

[12:23] You have to explain to them exactly what repercussions the company will face

[12:29] if you don't take it seriously.

[12:31] So at an overall level, the legal team and top management all have to work together.

[12:37] You have to convince top management as well, in terms of the fines that can be levied on them. Give examples of what other companies are going through, even when just a cookie banner is missing.

[12:48] Debbie Reynolds: Right.

[12:48] Anuj Jain: Or when consent hasn't been taken; there are a lot of fines out there.

[12:52] Taking all of these together is how you have to create the program. But yes, if you don't have the legal team, top management,

[13:01] and the technology team all working together, it's not going to work, because if the cycle breaks at any point, you may have all the policies and procedures but no one to check them.

[13:10] If the technology team hasn't implemented them, you are non-compliant. So everyone has to work together, and that's how you have to create it. Again, like I said,

[13:21] we also have privacy champions in every department, so people are more comfortable with that person than with the legal team or a dedicated privacy team reaching out.

[13:29] Because the language a privacy-centric person uses might be very different from how a person from their own department, doing the day-to-day work, would explain things to them.

[13:44] So all of these things together need to be there for it to work.
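A release gate like the one Anuj describes, where a product cannot move to production without an approved privacy assessment, might look something like this minimal pipeline-check sketch. The assessment API, status values, and service name are hypothetical:

```python
# Sketch: a deploy-time gate that blocks release unless a completed
# privacy assessment exists for the service. Hypothetical API and names.
import sys
import requests

PIA_API = "https://privacy.internal.example.com/api/v1"  # hypothetical

def assessment_status(service: str, token: str) -> str:
    """Return the privacy assessment status for a service, or 'missing'."""
    resp = requests.get(
        f"{PIA_API}/assessments/{service}",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    if resp.status_code == 404:
        return "missing"
    resp.raise_for_status()
    return resp.json().get("status", "unknown")

def main() -> int:
    status = assessment_status("checkout-service", token="redacted")
    if status != "approved":
        # The red flag: fail the pipeline so the release cannot reach production.
        print(f"Privacy assessment is '{status}', not 'approved'. Blocking deploy.")
        return 1
    print("Privacy assessment approved. Deploy may proceed.")
    return 0

if __name__ == "__main__":
    sys.exit(main())
```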

[13:47] Debbie Reynolds: So having that leadership from the top is a theme that I hear a lot, and I think it's true that if you don't have that buy-in from the top, it really doesn't work.

[13:57] And then having champions within the different groups who understand and act as a liaison with the privacy teams, that definitely helps. But tell me, what's happening in technology right now that concerns you about privacy?

[14:17] Anuj Jain: So obviously, you know, AI is here.

[14:20] If you've heard, there are agentic AI browsers launching recently: ChatGPT launched theirs, Perplexity launched their own.

[14:29] And with the free trials they're giving to everyone, everyone is jumping onto them, without even thinking about what kind of data

[14:39] they're giving out.

[14:40] Debbie Reynolds: Right.

[14:41] Anuj Jain: So yeah, with the free trials and free subscriptions being given, people are jumping on it, at least in India.

[14:49] ChatGPT recently announced that for India, because we are more than a billion people, they are going to give everyone a free subscription to ChatGPT Go.

[15:00] So that will convince people to use it, and to do everything with it:

[15:06] adding sensitive personal information,

[15:08] giving out their data for any kind of analysis, their financial data, their health data, and all that.

[15:16] So this adoption of AI without proper legislation in place in many countries

[15:24] is really concerning for me.

[15:26] Debbie Reynolds: I don't know, I feel like we have been telling people for the longest time: don't put certain things about yourself on the Internet. And now you have this incredible tool that they can do all these different things with.

[15:42] Sometimes people are so enamored with the capability that they're not thinking,

[15:47] wow, I'm putting in my personal bank information, you know what I'm saying?

[15:53] I'm writing a letter to the bank and I'm going to put my account number in here. So part of that education definitely has to be there with people, but I think it's just going to become a growing problem, because people really don't understand what can happen with their information,

[16:11] definitely.

[16:12] Anuj Jain: So, in terms of agentic AI as well, like I said, the browsers: if you want to buy something, you're going to put in your credit card number for the agent to place your order, you're going to put in your address,

[16:22] or if you want to order medicine, you upload your prescriptions.

[16:25] All of that is going to happen with these agents. For the agents to do that, they need that data, and people are just going to upload it because of the ease it provides.

[16:34] Obviously it's an incredible technology, but yeah,

[16:37] they're just giving away their data.

[16:41] Debbie Reynolds: I'm more tinfoil hat when I think about some technology.

[16:46] So I'm a technologist. I love technology, but I don't love everything that people do with technology.

[16:53] And I feel like I'm kind of a naysayer: I say, hey, use it this way but not that way,

[17:00] or think about this and not that. I remember many years ago there was a company,

[17:05] I think it was Amazon, that was asking people,

[17:09] you know, give me your biometrics and we'll give you a coupon for $10.

[17:14] And I always say to people,

[17:16] you have to weigh the pros and cons. Is that really worth it? Is that biometric I'm giving worth more than $10,

[17:27] a $10 coupon? So I think you have to really weigh the pros and cons of what you give, and understand what could possibly happen to that data in the future.

[17:37] Anuj Jain: Yes, definitely. I'm the same: as soon as a technology is launched,

[17:43] I don't just grab onto it and start using it. I go and find out exactly what their practices are and how it's going to work on the back end as well.

[17:53] And at least with my family and friends, I do inform them: please take care of this, and here's how to use it.

[18:01] But I don't think many people do that. It's mostly people in the security, privacy, or legal fields who do.

[18:10] But yes,

[18:11] the kind of exposure AI has: it's for everybody. It's not an age-limited kind of thing. An older person or a college student, everyone can use it.

[18:23] It isn't restricted to just a few people, like some other technologies.

[18:28] It doesn't have limited usage, so it can be used by everyone,

[18:32] every age group, every gender, every person.

[18:35] So yeah, it has a much bigger effect.

[18:38] But obviously, it's an incredible technology to use, and you have to use it wisely.

[18:43] I mean, you can't live without using AI now. But you have to watch exactly how you are using it.

[18:49] Debbie Reynolds: Right,

[18:50] that's true.

[18:51] I want to talk a little bit about privacy and culture.

[18:55] So I think different jurisdictions have their own regulations around privacy and data protection.

[19:04] But then also I think there is a cultural element to it as well,

[19:08] where different jurisdictions think about privacy or data protection in different ways. But I want your thoughts.

[19:17] You have worked with many different types of companies around the world, and so especially as someone

[19:24] who understands the way things play out in different jurisdictions, I just want your thoughts about privacy and how it connects with culture.

[19:35] Anuj Jain: Yes. Like you said, culture has a big role to play in privacy. So if I talk about people in India: in India, privacy became a fundamental right, I think, not so long ago.

[19:48] Before that, I don't think privacy was something

[19:52] people thought of as a right, or as an important or essential thing. Because,

[19:58] you know, in India, when you are sitting with someone, you just talk about everything with them. Even now, when you're traveling on a train, with people sitting around you, you'll tell them everything.

[20:09] That's the culture in India: you tell them about your children, your parents, what they do, how they're doing, just to make conversation.

[20:17] Debbie Reynolds: Right.

[20:17] Anuj Jain: So that is the culture.

[20:21] Even today there aren't many people in India who are concerned about privacy. I would definitely say they don't even think twice before giving data away, in terms of why,

[20:30] or how it's going to impact them. And we still don't really have legislation in force. I mean, we have had a law for the last two years or so, but the rules are still pending.

[20:38] It hasn't been implemented yet. There's no implementation date for companies.

[20:43] So in India, I don't think people are there yet, and it's going to take a lot of time, because many people don't even know how to use the Internet right now.

[20:50] Privacy is still a very distant concept for them, at least.

[20:56] And if I talk about other regions, like Europe or the US, people there are more concerned about privacy. This is what I've seen. If you ask them for something, at least a question will come back: why do you need that?

[21:08] Right.

[21:09] So even in our organization, if I'm talking to someone from the US or Europe, they would understand why we are restricting a use.

[21:17] And they do understand what sensitive data is by default;

[21:21] even without GDPR or anything, they understand what sensitive data versus normal data is, and where to share it. At least they have the culture of asking why.

[21:30] Right.

[21:30] And in other regions, you know,

[21:32] there's not even a culture of asking the question: why are you taking this?

[21:37] So it's definitely a different mindset. I think this is also because in Europe a data protection law came in back in the mid-1990s, while in India we are only now talking about it.

[21:49] So it depends on all of that.

[21:51] So yeah, the culture is like that as well. Culture does affect this, and these are the different types of people I've met and observed across various continents.

[22:04] Debbie Reynolds: I have a thought about AI and privacy, and I want your thoughts.

[22:09] So in my view, when people are implementing or using AI systems, it may create more privacy risk because, like you said before, they're using it for a lot of different things. And a lot of AI

[22:30] made in the past was very narrow and purpose-built. Now you have AI systems that can do a lot more things, and people are putting a lot more types of data into these systems.

[22:42] And so I think that alone makes the privacy risk broader for companies and for people, and it will create more risk that needs to be assessed in a different way than maybe another type of application would be.

[22:59] Let's say you had an accounting application that didn't have any AI. You would assess that differently than if you had brought a large language model into your organization or something.

[23:09] But what are your thoughts?

[23:11] Anuj Jain: So like I said earlier, AI doesn't limit its usage or age group; it is for everyone. So it definitely raises the stakes, right? It can be used for everything.

[23:21] And it's coming everywhere: it's in your smartphone, it's in your laptop, it's in your applications.

[23:27] I think smartphone companies are now coming out with personalized AI, like a personalized agent for you, which will work on your data.

[23:37] So it definitely raises the stakes

[23:42] in terms of privacy risk. But in terms of assessing, like you said,

[23:47] for AI, I would say you don't only need to assess

[23:52] how the company is collecting data, right? You also need to assess how they're using it and what they are using it for,

[24:00] right? So earlier, in terms of GDPR,

[24:03] you would restrict the transfer of data from company X to company Y, right? But now another company, a third party, comes and scrapes the data off your website.

[24:15] And that doesn't require a notice to be given.

[24:17] A transfer requires notice, but scraping of data doesn't; companies don't even give you a notice for that.

[24:27] Similarly,

[24:32] under GDPR there's a right to stop the automated processing of your data, right? But there's no equivalent right to refuse to have your data processed by a model. So you are only asking whether automated processing should be there or not,

[24:50] but you can't ask: what data are you putting into it? You are just assessing the kind of data that is collected,

[24:58] and there's no right to assess the risk in terms of what algorithm is being used to arrive at a result.

[25:07] Debbie Reynolds: Right?

[25:08] Anuj Jain: You can rectify your data, but you can't rectify the output coming out of AI, right? Take a credit bureau, for example: you can rectify your address and other details, but you can't know how your credit score was arrived at.

[25:23] The legislation is missing that part. So you have to make sure that you are assessing all parts of it, the whole data lifecycle, when you are assessing AI, rather than only the collection part or a few parts here and there,

[25:34] right?

[25:35] So that is something you would do differently.

[25:39] Debbie Reynolds: Totally. I agree.

[25:40] And I do a talk about the data lifecycle, so I say data has a life from cradle to grave. And a lot of times, as you say, a lot of companies are very good at collecting data, and they used to focus a lot of their attention on that first intake of data.

[25:55] But now AI is here. They have to think about how the data flows throughout the organization and be concerned with what goes in and what comes out. Right.

[26:06] Just like you said, with the credit score information,

[26:09] you may not know what happens inside the system,

[26:13] but all companies or organizations will be accountable for what comes out and how they use what comes out of those systems.

[26:21] So thinking about it in terms of a lifecycle, I think that has to be the cultural shift for organizations: they were maybe more concerned about the provenance, how data started within the organization, as opposed to how the data flows through the organization and how they're going to use it all the way to the end of its lifecycle.

[26:41] What do you think?

[26:43] Anuj Jain: Yes, definitely. Like you said, organizations need to look at the whole picture. They can't just look at the personal data collected; they also have to check the algorithms they are using for fairness, bias, and all that as well.

[26:56] You can't just use any data. You have to take care of data quality more than ever now, so that you get an output that is less biased and is fair to everyone.

[27:08] Right. I think we have seen some examples as well. There were some police,

[27:15] state police, using certain AI that flagged Black people disproportionately. And I think there was also

[27:23] an HR system that was referring women to more HR jobs and men to more engineering-related jobs. These are all examples where bias and fairness come in.

[27:36] So yeah, we have to look at other aspects when it comes to assessing AI, not only the notice for gathering data and all that, because there's a lot more involved.

[27:49] Debbie Reynolds: Right.

[27:50] Anuj Jain: We give notice for collecting data, but not for the data that is generated on the basis of that data. You don't give notice for that,

[27:58] but the generated data is equivalent to data you collect from someone directly.

[28:02] Debbie Reynolds: Right.

[28:03] Anuj Jain: So all that needs to be taken care of. Right?
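As a rough illustration of the kind of fairness check Anuj alludes to, the sketch below computes a demographic parity gap, the difference in positive-outcome rates between groups. Real fairness audits use many more metrics and real decision logs; the data and group labels here are invented for illustration:

```python
# Sketch: a basic demographic parity check comparing the rate of positive
# outcomes (e.g., "shortlisted") across groups. Invented sample data.
from collections import defaultdict

def positive_rate_by_group(rows):
    """rows: iterable of (group, positive) pairs -> positive rate per group."""
    totals = defaultdict(int)
    positives = defaultdict(int)
    for group, positive in rows:
        totals[group] += 1
        positives[group] += int(positive)
    return {g: positives[g] / totals[g] for g in totals}

def demographic_parity_gap(rows):
    """Largest difference in positive rates between any two groups."""
    rates = positive_rate_by_group(rows)
    return max(rates.values()) - min(rates.values())

if __name__ == "__main__":
    # Hypothetical screening decisions: (applicant group, shortlisted?)
    decisions = [("A", 1), ("A", 1), ("A", 0), ("B", 1), ("B", 0), ("B", 0)]
    print(positive_rate_by_group(decisions))  # A ~ 0.67, B ~ 0.33
    print(demographic_parity_gap(decisions))  # ~ 0.33: a large gap to investigate
```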

[28:06] Debbie Reynolds: Yeah,

[28:07] it's a big deal. It's a big job, and it's only getting bigger, as you say,

[28:13] because, you know, I remember early on a lot of organizations were like, oh,

[28:18] we don't even want to use AI. And it's like, well, you can't stop it, because every tool that comes in has AI in it. So you really have to deal with it and be able to manage it the best way that you can.

[28:32] So, Anuj, if it were the world according to you and you had your wish for privacy or data protection anywhere in the world,

[28:42] what would be your wish?

[28:44] And whether it be human behavior,

[28:48] technology or regulation.

[28:53] Anuj Jain: I would say human behavior first.

[28:55] Because this is how the market is driven: if everyone in the market, everyone in the country, is concerned about their privacy, they're only going to accept products that are privacy-first.

[29:07] So the people are the ones who bring in the money to a company, and they are the ones the company wants to take care of. Like the example you gave earlier,

[29:14] that Amazon was giving a $10 coupon in exchange for biometric data. If people were more concerned, if the human behavior were to hold back

[29:22] and ask why first,

[29:24] Debbie Reynolds: Right.

[29:25] Anuj Jain: the company wouldn't be able to do that. They are only able to do it because people aren't that concerned. So the first thing I would change is human behavior.

[29:35] Debbie Reynolds: I agree with that.

[29:37] And I think sometimes we feel like technology can solve all of our problems,

[29:43] but I don't think that's true. I think technology can enable us to help solve problems, but that human element has to be there.

[29:51] Yeah, yeah.

[29:52] Anuj Jain: Because technology is also created by big tech based on what people want. I mean, it's all about the money that comes in for them. If it's coming from people using a lot of AI,

[30:04] they are going to create that. But if people stop using it, if they only want privacy first, then obviously

[30:10] they will have to put in certain checks before bringing it out.

[30:17] Debbie Reynolds: That's true. I think people don't realize how much power they have, because companies want to make money. So if they feel like they're doing something that will upset you, even if it's not something that's regulated, they'll stop or they'll do something different.

[30:33] So don't forget about your human rights; try to make the best choices you can with your data, and think about it before you give your data away. That's great advice.

[30:46] Great advice.

[30:47] Well, thank you so much. Thank you for staying up late to be able to do this call, and I'm really excited to be able to share this episode with everyone.

[30:56] So wise, wise advice. Thank you so much.

[31:00] Anuj Jain: Thank you, Debbie.

[31:02] Debbie Reynolds: All right, talk to you soon. Thank you so much.

[31:04] Anuj Jain: Thank you.

[31:05] Debbie Reynolds: Bye-bye. Okay, bye-bye.