Insights@Questrom Podcast

Navigating Internet Privacy: Unpacking Cookies, Dark Patterns, and Future Data Protection Challenges

Boston University Questrom School of Business

Unlock the secrets of internet privacy as we journey through the intricate world of digital cookies and their profound impact on our online lives. Featuring insights from Garrett Johnson and Tesary Lin of Boston University Questrom School of Business, we explore the evolution of privacy regulations and their influence on major browsers like Chrome. Discover how first-party and third-party cookies have transformed digital marketing, and learn about Google's innovative Privacy Sandbox initiative, which seeks to balance user privacy with the needs of advertisers and publishers.

We also shine a light on the controversial "dark patterns" in user interface design that can manipulate cookie consent rates, challenging the effectiveness of privacy regulations like the GDPR. Delve into the complexities of data protection with a focus on the Global Privacy Control and the potential for a unified U.S. federal privacy law. As we navigate the emerging landscape, we address the responsibilities of AI platforms and social media in safeguarding user data and managing content. This episode promises to equip you with a deeper understanding of the ongoing challenges and future implications in the realm of internet privacy.

J.P. Matychak:

Greetings, everyone, and welcome to another episode of the Insights@Questrom podcast. I'm J.P. Matychak, and alongside me is my co-host, Shannon Light. Shannon, how are you?

Shannon Light:

I'm great, thanks.

J.P. Matychak:

Well, today we're talking about cookies. Not those cookies. That's a different podcast altogether. No, we're talking about digital cookies and internet privacy. If you've been anywhere on the internet lately, you're all too familiar with the banner across the bottom of pages informing you that the site uses cookies and asking you to accept their use. And if you're anything like me, you accept the statement without fully appreciating what that even means.

J.P. Matychak:

But cookies may not have much life left in them. Since 2020, Google has been working to eliminate the use of cookies across its Chrome web browser, which, according to StatCounter.com, dominates the browser market with over 65% of users worldwide. Since Google made their announcement, we have seen significant changes, and they've been testing their Privacy Sandbox with the goal of phasing out the use of all third-party cookies by the second half of 2025. Here to talk to us about cookies and the broader issues around internet data security are Garrett Johnson, Associate Professor of Marketing, and Tesary Lin, Assistant Professor of Marketing, both from Boston University Questrom School of Business. Garrett, Tesary, thanks for joining us. So let's start by level-setting our listeners. What are cookies, exactly?

Garrett Johnson:

Well, you see, there's chocolate chip, there's oatmeal... nobody likes the oatmeal raisin ones, but the chocolate chip ones are my favorite.

J.P. Matychak:

That's my favorite too.

Garrett Johnson:

It says a lot about me, I know.

Garrett Johnson:

So if we're talking about browser cookies, browser cookies essentially allow the web to have a memory.

Garrett Johnson:

So, without a browser cookie, if you went to a website and found a product you wanted to purchase and added it to your cart, when you went to the cart, the website wouldn't know that you'd added the product, and so it wouldn't be a very satisfying experience.

Garrett Johnson:

So a cookie is just a text file with an identifier in it that identifies an individual user, and the example I talked about was a first-party cookie, meaning the website you're visiting puts a text file on your computer to remember who you are. What's more controversial is what's called a third-party cookie. There, the third party refers to vendors that the website works with to do work like targeting advertising, and in that case the browser is interacting with a third-party domain owned by a company like Google. That creates a third-party cookie. That identifier still serves the same idea of giving the web a memory, but now it allows these different vendors to connect your behavior across websites. Critically, what that allows them to do, and what transformed digital marketing, is connect eyeballs to wallets: they can see who sees an ad and who subsequently purchases.
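To make the mechanics concrete, here is a minimal sketch of how a first-party cookie gets issued. The function name and cookie attributes are illustrative, not from the episode; in practice a web framework does this for you via an HTTP `Set-Cookie` response header.

```python
import secrets

def set_first_party_cookie(response_headers):
    # Issue a random identifier; the browser stores it and sends it back
    # on every later request, which is all the "memory" a shopping cart needs.
    uid = secrets.token_hex(8)
    response_headers["Set-Cookie"] = (
        f"session_id={uid}; Path=/; HttpOnly; SameSite=Lax"
    )
    return uid

headers = {}  # stand-in for an HTTP response's header map
uid = set_first_party_cookie(headers)
```

A third-party cookie is set the same way; the difference is only that the responding domain (say, an ad server) is not the site shown in the address bar, which is what lets the same identifier reappear across many sites.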

J.P. Matychak:

So it sounds like, for businesses, this was probably one of the bigger innovations in digital marketing and e-commerce: the ability to track customers and pull them back in. So why are cookies going away? I mean, you said some of them are controversial, the third-party ones and whatnot. And why are companies who I imagine have profited from them, like Google, making the move to eliminate their use?

Garrett Johnson:

Yeah, I think that there's a sea change in how people are thinking about privacy. Part of it's coming from large government regulations like the GDPR in Europe.

J.P. Matychak:

And the GDPR is.

Garrett Johnson:

Sorry, the General Data Protection Regulation in Europe, which is a kind of generational change in privacy regulation. It's quite an all-encompassing regulation. And public sentiment is shifting, caring more about privacy and maybe being more negative toward the tech sector. These combined forces mean that the large browser vendors have started to take it upon themselves to get rid of cookies, and so actually Chrome is the last major browser to still have them. Safari already blocks third-party cookies, and so does Firefox. What's different about Chrome is that Chrome is saying, okay, if we're going to get rid of these things, well, cookies do actually have a lot of redeeming features. Let's try to create replacement technologies before we just get rid of them.

J.P. Matychak:

So let's talk about the privacy sandbox. You know we mentioned at the top of the show that Google's been testing this privacy sandbox, and you also mentioned some other replacement technologies for this. So let's talk a little bit about. What is this privacy sandbox? How does it work? What are some of the other tools that maybe some of the other browsers are working with as replacements? Talk a little bit more about this.

Garrett Johnson:

Yeah, so Privacy Sandbox is a collection of technologies, and their goal is to preserve the benefits of cross-site identity that you get from third-party cookies while offering superior privacy protection to users. There are a few parts worth unpacking there. One is that it's a collection of technologies: right now, third-party cookies are a very simple technology, but they allow many use cases, like the ability to target ads, to measure ads, to reduce fraud, and just to track user behavior and visits to websites. The benefits are that people get ads that are more useful to them, publishers get more revenue, and users get free content. Our research suggests that publishers get about double the revenue when they have third-party cookies, so it's very valuable to publishers. Now, the privacy protection is interesting, because consumers will still be seeing ads similar to what they see now, targeted based on the website they're browsing. But it's going to be hard for any one company or government to put together user behavior across websites in a way that is potentially possible today.

Shannon Light:

I know I might be jumping ahead here, but I am very curious to know: what does digital marketing, digital advertising, really look like without these cookies?

Garrett Johnson:

Yeah, so I think there are four main things that kind of replace third-party cookies. One main thing is Privacy Sandbox. Another is falling back to contextual targeting. So if an advertiser doesn't know anything about you, they still know that you're on, say, a finance-related website, and they can show you a finance-related ad.

Garrett Johnson:

A third is that the largest companies in this space will certainly continue to be dominant. The Facebooks, the Googles, the Amazons of the world maintain a lot of data about you that they own themselves, and that puts them at a relative advantage. And fourth, because these cross-site identifiers are so valuable to advertising companies, if you get rid of one technology, then historically something that looks very much like it takes its place. One shape that takes is that, increasingly, you get websites asking you to log in when you visit. What's going on is that they're taking your email address, encrypting it, and using that encrypted email address as your identifier in place of the third-party cookie. There are also more surreptitious ways of doing that, called fingerprinting, which try to use information about your browser, like your IP address, in order to identify who you are. So yeah, if you get rid of these technologies, life finds a way.
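As a sketch of that email-based identifier: in practice these schemes typically normalize the address and run it through a one-way hash such as SHA-256 (often loosely described as "encrypting"), so the raw address itself never has to be passed around, yet two sites that see the same login can still arrive at the same identifier. The function name here is hypothetical.

```python
import hashlib

def email_identifier(email: str) -> str:
    # Normalize so "Jane@Example.com" and " jane@example.com " agree,
    # then hash so only the digest, never the raw address, is shared.
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# The same login on two different sites yields the same identifier,
# which is what re-links behavior across those sites.
same_person = email_identifier("Jane@Example.com") == email_identifier(" jane@example.com ")
```

That matching property is exactly what third-party cookies used to provide, now gated on the user consciously logging in on both sides.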

Shannon Light:

It's very valuable to advertisers, and so it kind of comes back to life. And through all of that, putting in your email address or, like you said, even tracking IP addresses, that then gives marketers the ability to track people geographically and target them geographically.

Garrett Johnson:

Well, track them as an individual. So if you visit a Nike website and buy Nike shoes, and then you go visit Yahoo, where you're a logged-in user, then Nike can find you at Yahoo and show you a Nike ad. But in order for this to work, you move away from a more permissionless system, where cookies are installed on people's browsers, at least at the beginning, without any interaction from them at all. Now you have to consciously log in, and you have to log in on both sides in order to make this connection. Effectively, what this does is reduce the free flow of information and the scale of this kind of targeting, and it creates more control. So it's less permissionless, but it also makes it harder to enter the space, especially if you're an advertiser or a publisher without a lot of customer relationships through which to get email addresses.

J.P. Matychak:

So, tess, your part in this has been a lot about data privacy and data security. You know, an impetus to a lot of these changes has been, as Garrett said, the shifting sentiment around about data privacy and people just becoming more aware of data privacy. So how has data privacy issues and we hear about breaches all the time. I mean, it seems like I get a new email every single day about another company that you know had a breach and lost my information. How is this impacting the way the world is changing in the way of cookies and other ways of tracking consumer information and dispersing that information?

Tesary Lin:

Yeah, that's a very good question. I think 10 years back it's quite plausible that consumers were, by and large, not aware of companies' tracking practices. But given the recent development of privacy regulations, and also all the changes from Apple and from Google, consumers are increasingly aware of data collection and usage practices. Now, that doesn't necessarily mean that consumers are unwilling to share. In my own research, consumers some of the time are willing to share their data with some companies, but it is true that they are more discriminating about which companies they are willing to share with. So, something we have done recently: we ran a large-scale field experiment with Andrey Fradkin here at Questrom and Chiara Farronato at Harvard.

Tesary Lin:

So basically, what we did is try to look at the influx of consent banners triggered by the GDPR: what does that do to consumers in terms of their willingness to share their data with websites?

J.P. Matychak:

And those consent banners, sorry, are those the little banners I talked about at the top, right? Where you get to the website and it says, hey, this site uses cookies, and are you giving us permission to use them? So when you say consent banners, that's what we're talking about? Exactly. Great, okay.

Tesary Lin:

Exactly. So basically, we ran an experiment to see, when consumers interact with these websites and see a banner, what percentage of the time they are willing to share their data with the company, and the baseline sharing rate is around 60%. Now, 60% is actually higher than what we would expect, because if you compare that result with what has been reported after Apple pushed its own consent prompts on the iOS platform, the sharing rate there is around 27%. So this is higher than we expected, but it's not 100%, and I think most of the time what advertisers want to see is 100%. Now, this is also the baseline rate when the website doesn't do any optimization of how it elicits data from consumers. Depending on the specific user interface design, the consent rate can go up to around 80%, or, in some of the designs we tested, the probability of rejecting all cookies could go down to almost 5%.

J.P. Matychak:

Wow. What are some of the changes that impact that? You said you tested different designs, so what makes someone more likely to say yes or opt in?

Tesary Lin:

Yeah, so we have tested different types of visual designs, and this is something that, if you look at the news, they sometimes refer to as dark patterns. These are different visual elements that nudge consumers into certain actions. Among the specific design patterns we tested, one is hiding a specific action from the main user interface. For instance, if I'm a website, I don't want consumers to click "reject all cookies," so I bury that option. Among all the basic design elements, that is the most effective one; it tends to increase the consent rate by around, I think, 10%. Another design element that is effective is the default. If you set the default to sharing all cookies, then it's going to be effective in nudging consumers into sharing.

Tesary Lin:

Now, defaults are kind of tricky for websites to use because, if you look at recent implementations of the GDPR, a lot of countries are actually banning the use of defaults. So something we have seen is that companies are shifting to alternative design patterns in order to nudge consumers into sharing. On the flip side, some other patterns we tested, ones that only change visual elements, for instance graying out certain options or re-ranking them (say, putting the "share all" option at the top versus at the bottom), don't actually change privacy choices that much. So the takeaway is that changing pure visual elements alone is not going to affect privacy choices by a lot, at least in 2024. There may be an element of consumers getting acclimated to different design elements. On the other hand, designs that actually make it harder and more time-consuming to perform a certain action remain effective.

Shannon Light:

To your point about one of the more effective tactics being actually hiding the option in the consent banner, I'm just curious: how is that allowed?

Tesary Lin:

That is a good question. Well, I'm not a legal scholar. My best explanation for why it is currently, quote unquote, allowed is that the regulators are still not catching up to the evolving banner design practices. It's more or less a cat-and-mouse game, in the sense that the regulators are playing whack-a-mole, trying to catch the most manipulative patterns that work. But given that most companies have all the infrastructure to test and optimize their consent banners, it is very plausible for them to find the next banner design that works.

Garrett Johnson:

I'd like to jump in on that a little bit. I think this is a symptom of the really troubled relationship that we have as a society with privacy, because we want to live in the data economy, and we get a lot of benefits from that, but we also want to maximize our privacy. The GDPR in particular, in Europe, is pushing a view of privacy under which people should be providing opt-in consent, and, as you say, it shouldn't be possible to hide your banners. So why do these sorts of things persist, at least in some parts of the EU, like six years after the regulation was put into place? I think it's because it's extremely valuable to websites to be able to collect people's consent and thereby monetize their ad impressions, to the point of doubling that revenue. Tesary makes a good point about the technical challenges here, but I think a big challenge from a regulator's perspective is that you have the mandate from the public to increase people's privacy.

Garrett Johnson:

You don't have the mandate from the public to go and shake down websites and reduce their revenue by a factor of 50%, and so this creates a tension. That means some regulators have been more aggressive in pushing designs that, as Tesary points out, would increase or decrease consent rates, but others have just kind of backed off, and I think this is a really challenging part of it.

J.P. Matychak:

Do either of you see a world in which this type of consent would be standardized, in the sense of how it's implemented? Or do you think that's just a bridge too far for many regulators, to mandate what technologies you're going to use? The reason I ask is that I found it interesting that on some websites I can't do a thing until I acknowledge that banner, and on others I can scroll away and never have to do anything with it. It just goes to your point that people are still testing, and some have it hidden, as long as it's there somewhere. So do you see any type of future where that's standardized, and what would be expected?

Tesary Lin:

I see some version of standardization floating around as a proposal. In particular, well, I don't actually know whether this would count as standardization, but one thing the regulators have been discussing as a way to reduce the impact of non-standardized banners is what they call Global Privacy Control. Now, what is Global Privacy Control? It's basically saying: I want to give users, the individual consumers, the option to turn cookie sharing on or off at the browser level. That is the sense in which it's called global.

Tesary Lin:

Now, the current version of Global Privacy Control still says that if a user turns off the privacy choices at the browser level, but an individual website later asks for and gets consent, the website can override that. So the devil is actually in the details, because imagine the situation where a consumer turns off cookie sharing at the browser level. Individual websites will have the incentive to keep nagging the consumer to share their data; going back to Garrett's point, websites have the incentive to collect more data. So it is a bit hard for me to see how that whole proposal, once implemented, is going to play out. But it has been discussed a lot by the regulators.

Garrett Johnson:

Yeah, I agree. I think it's a salient change to the internet that we've seen from these privacy regulations, and for many of us it's not a good experience to have these things pop up in our face all the time. Tesary mentioned Global Privacy Control; this is actually something the California privacy law has explicitly pushed for, and the regulators are actually going after companies that aren't respecting it, so we might see more of that. The challenge, though, is that standardizing these consent processes could probably be good for consumers, but the very nice finding from Tesary's research is that exactly how you set up that decision, and how you frame it, is going to massively affect consent rates. So I think the scary thing for stakeholders here is: who sets these defaults? Is it firms, is it browsers, is it government? Because that's going to be enormously consequential.

J.P. Matychak:

And there's even variation. Sorry, I know you were about to ask, but I want to follow up on something, especially since there are gaps, or a lack of reconciliation, in some of the regulation. As you mentioned, the GDPR, those are the regulations in Europe, right? And we're not necessarily beholden to those same regulations here in the US, unless you've got workings in the EU and other places overseas. I imagine that for many companies, especially global ones, it's difficult to reconcile regulations here and abroad. Are you seeing a call for just, hey, let's have one universal privacy protection regulation for internet privacy? Or do you think that's something people are just going to avoid? Best guess, or what you've been hearing?

Garrett Johnson:

I think that, from a firm's perspective, complying with a European privacy law, with what's getting up to like 15 different US state laws, and with different laws throughout the world gets to be a substantial headache. So there's a bit of a highest-common-denominator effect: just comply with the strictest possible law, then set it and forget it and be done with it.

Garrett Johnson:

And that's maybe why you start to see these more GDPR-consistent consent banners on websites that really aren't interacting with EU users. As for your question about what comes in the future, one potential thing you could see is a federal privacy law in the US that would start to level-set things. But in the meantime, these companies are basically going through the full decision tree: if a user is coming from this place, then this is what we're going to show them, and these are the standards we're going to apply, trying to make it as consistent as possible. And it's created a business for some companies.

Tesary Lin:

Yeah, on the regulation side: both Garrett and I have been following privacy regulations for a long time, and something we see is that there is willingness to push for a federal-level comprehensive privacy law that is standardized, but the proposals have received a lot of pushback. For instance, places like California want to stick to their version of the privacy law, which is more restrictive and more privacy-protective, and most of the privacy bills at the federal level are much more lenient than that. So there is this tension between individual states wanting to implement their own version of privacy law and the federal level wanting to make everything standardized. In practice, my personal conjecture is that if you want to see a version of the future where we have a standardized privacy regulation, that is going to be pretty hard even at the national level. If you want something standardized across countries, that is going to be even harder.

Shannon Light:

Yeah. I mean, with the rise of OpenAI and ChatGPT, I always wonder about the amount of information people are putting into those types of platforms. What is the regulation around that? I know it's spitting back out the knowledge it has from being built, but can you explain the regulations around a platform like artificial intelligence? Who's going to regulate that?

Tesary Lin:

So I'm not actually aware of regulations so far that specifically target generative AI and these large language models. There might be something that gets proposed later. But what I have been seeing is that, at least if you look at companies as users, they are very concerned, and they have been putting various restrictions in place preventing their employees from using these platforms. I think this is a very interesting example, because here you actually see that for this specific product, where the users are companies, they are taking very active measures to protect the company's privacy, which concretely means intellectual property and other types of business secrets.

J.P. Matychak:

It seems to me that, as I think I mentioned off air, we had a conversation around social media platforms and their responsibility for editorial content, and people being censored and whatnot, and it seemed like that was the next wave of regulation and policy we were going to have to deal with. Now it seems like this one, data protection and privacy, is the next wave. Again, as we said, it seems like every day there's just a new story. Let's look out to the future.

J.P. Matychak:

We, of course, know where we are now. Where do you see the big red flags of what we're going to have to deal with next, either as regulatory bodies or as a society, when it comes to data protection over the next five to 10 years? What aren't we talking about yet that, as you've done your research and looked at these things, makes you say, boy, if we don't resolve this, we're going to see some issues?

Garrett Johnson:

I think we still have this longstanding tension between, again, wanting to live in the data economy and getting the value that's created by data, and wanting to improve privacy at the same time. We're a business school; we teach analytics to our students; it's a huge part of how modern business works. But the regulations that have tried to improve privacy, like the GDPR, have had some substantial downsides for the data-driven economy, and one in particular is a harm to competition. So I think we have to wrestle with how we create a better privacy regulation that balances those two goals.

Garrett Johnson:

And I think one thing that's really interesting, which Privacy Sandbox speaks to, is that there are new technologies developed by computer scientists called privacy-enhancing technologies. What they try to do is let you have your cake and eat it too: have privacy, but also allow data to be used in ways that create value. And I don't see enough regulators thinking about that very important issue, because, you know, it's kind of a weird technology.

Garrett Johnson:

We talked about AI. AI is going to diffuse itself, because everybody wants to use this great technology. Privacy-enhancing technologies are kind of worse for firms, so if you want them to diffuse, you need to create some social incentive for firms to actually take them up. And one thing I'm a little bit sad to see is that, with so much focus on cookies, in some sense we're focusing on solving last year's problems, or the problems of the last two decades, and not thinking about the possibility of replacing them with a host of technologies like those provided by Google, or proposed by Apple or Microsoft. What are we as a society going to do to make that third way, with all of its trade-offs, something we consider as well?

J.P. Matychak:

Well, it'll certainly be interesting as we continue to navigate all of this, where people want to have their cookies and eat them too. Right, there you go, I had to bring it all full circle. So, Garrett, Tesary, thank you so much for joining us and helping us make sense of it all. We really appreciate you coming on the show with us today.

Tesary Lin:

Thank you for having us.

J.P. Matychak:

Great. Well, that'll wrap things up for this episode of the Insights@Questrom podcast. I'd like to thank our guests again: Garrett Johnson, Associate Professor of Marketing, and Tesary Lin, Assistant Professor of Marketing, at Questrom School of Business. Remember, for more information on this episode and previous episodes, along with other insights from Questrom School of Business experts, visit us at insights.bu.edu. For Shannon Light, I'm J.P. Matychak. So long.