AHLA's Speaking of Health Law

Data Privacy and Security Risks in Technology-Enabled Health Care

AHLA Podcasts

The digital health landscape has changed the way that patients are tracking and managing their health care, raising new concerns around data privacy and security risks. Wes Morris, Senior Director, Consulting Services, Clearwater, speaks with Bethany Corbin, an attorney who provides strategic guidance and legal counsel to health care innovation and femtech companies, about the major privacy and security concerns that have arisen under the current landscape, specifically around telehealth, apps, and wearables. They also discuss emerging enforcement trends, including the Federal Trade Commission's increasing role, and privacy and security best practices. Sponsored by Clearwater.

AHLA's Health Law Daily Podcast Is Here!

AHLA's popular Health Law Daily email newsletter is now a daily podcast, exclusively for AHLA Premium members. Get all your health law news from the major media outlets on this new podcast! To subscribe and add this private podcast feed to your podcast app, go to americanhealthlaw.org/dailypodcast.

Speaker 1:

Support for AHLA comes from Clearwater, the leading provider of enterprise cyber risk management and HIPAA compliance software and services for healthcare organizations, including health systems, physician groups, and health IT companies. Their solutions include their proprietary software-as-a-service platform, IRM Pro, which helps organizations manage cyber risk and HIPAA compliance across the enterprise, and advisory support from their deep team of information security experts. For more information, visit clearwatercompliance.com.

Speaker 2:

Hello, and welcome to this episode of the American Health Law Association's podcast, Speaking of Health Law. I'm your host, Wes Morris, Senior Director of Consulting Services at Clearwater. Joining me today for a return appearance is Bethany Corbin, an attorney who provides strategic guidance and legal counsel to healthcare innovation and femtech companies, to talk about the challenges healthcare organizations face when attempting to comply with different regulatory frameworks. I'm also very pleased to announce that Bethany was recently honored as a recipient of the American Bar Association's On the Rise Top 40 Young Lawyers Award, which provides national recognition for ABA young lawyers who exemplify a broad range of high achievement, innovation, vision, leadership, and legal and community service. By continuing to provide her perspectives through this podcast and a variety of other channels, it's obvious to us why she was selected. Bethany, it is a pleasure to have you back.

Speaker 3:

Thank you so much, Wes. It's a pleasure to be here, and I truly appreciate those kind words.

Speaker 2:

Absolutely, my pleasure. Let's start with something that really came to the forefront during and after COVID, and that is how the digital health landscape has changed the way that patients are tracking and managing their healthcare conditions at home, including statistics around app usage and those sorts of things. Tell me why this is a concern and what your thinking here is.

Speaker 3:

Absolutely. When COVID hit, it ushered in a revolution in digital healthcare. Before COVID, we had moved to electronic medical records, and there was some digitization of patient portals and patient information, but consumers were not actively tracking their own health conditions to the extent we now see, and telehealth had not yet had its big boom. During COVID, there was of course the concern that going to a healthcare provider's office could unnecessarily expose you to the virus, and a lot of individuals weren't comfortable making those in-person visits. So we saw a huge shift in the healthcare landscape toward moving care online, to digital, and to at-home settings. What I mean by that is there was a huge proliferation in the number of tech companies coming on the market offering at-home digital health solutions, like the apps and wearables that everybody is obsessed with right now, and there was also a rush on the provider side to build telehealth platforms so providers could continue caring for patients who were uncomfortable coming into the office. So we had a mass movement online and at home, and now we've started to see that become permanent as the COVID public health emergencies and orders have wound down. It has become embedded and ingrained in our healthcare culture. We've even started to see the emergence of virtual-first and virtual-only care offerings from some of the larger healthcare networks and insurance companies, and more and more of these apps and wearables are being developed and designed so that patients can have control over their healthcare data and really understand what's going on with their bodies. So that's the background of how we got here.

Now, Wes, you asked about the problems I'm seeing with this landscape, and there are a couple. I know we'll get into some of these going forward, so I won't spend too much time. The first is that when COVID hit, things moved online so rapidly that there wasn't a huge emphasis on building privacy and security protections into the platforms, apps, and wearables being used today, because the priority was moving online as quickly as possible. In addition, we didn't necessarily have the frameworks in place that we needed, the ones that existed back when we had paper records and in-person visits, and that opened up a lot of different verticals of risk, not only for privacy but for cybersecurity. And because the value of health information is so great, we're seeing more and more cybersecurity attacks as well. Then we have the issue of which privacy laws even apply, because we have all of these healthcare applications and wearables on the market today. I think there's a common misconception among consumers that the Health Insurance Portability and Accountability Act, or HIPAA, and its privacy and security rules apply across all of the different apps and wearables on the market, and that's not the case. So there are actually some huge gaps in data privacy and security protections.

Speaker 2:

Yeah. It really comes down to who is the holder of the data as to whether or not HIPAA may be applicable, and that is either a covered entity or a business associate of a covered entity. Or, in the substance use disorder market, it could be a QSO, or qualified service organization, which is essentially a business associate. But in addition to HIPAA, and in addition to the Office for Civil Rights and the Department of Health and Human Services, there's another organization that has also taken a real heavy interest in this subject. And that is who?

Speaker 3:

The Federal Trade Commission.

Speaker 2:

Why?

Speaker 3:

<laugh> Yeah, it's a great question. When tech companies or other healthcare providers in this digital health space come online, they're not necessarily thinking FTC, right? They're thinking HIPAA; they're thinking OCR and HHS. But here's the thing: the FTC is actually the one doing a lot of these investigations into companies that may not be regulated under HIPAA, or that may be regulated under HIPAA but are engaging in unfair or deceptive acts and practices in ways that are harmful to consumers. Let me take a step back. The FTC is not a privacy-specific or privacy-centric organization; that's not its main mission. The FTC really exists to guard consumers against unfair and deceptive acts and practices in or affecting commerce. We get the tie to privacy because a lot of these healthcare companies, including the healthcare applications and wearables, have to produce what are called privacy notices, or, as they're more commonly known, privacy policies, detailing how they're going to use consumers' health data and individually identifiable personal data: how it will be collected, stored, used, and disclosed. Where the FTC has started to come in is that it has found there is sometimes a mismatch between what companies say in those privacy policies and how they're actually using data, and it's saying that that disconnect misrepresents the companies' practices to consumers and is an unfair and deceptive act or practice. That's where a lot of companies have gotten into trouble; recently we've seen Flo, Premom, BetterHelp, and Cerebral under investigation. So a lot of health tech apps are now in trouble with the FTC under its unfair and deceptive acts and practices authority. We've also seen a relatively new rule come into play, and I say relatively new, but it's not; it's been in existence for more than a decade, and it's just now getting some teeth and some legs. The FTC has a Health Breach Notification Rule, which is different from HIPAA's breach notification rule, and for a long time it simply wasn't used. Now we've seen two cases brought under that Health Breach Notification Rule recently for failure to notify when there's been an unauthorized use or disclosure of personal health information.

Speaker 2:

Excellent. I want to circle back on one point you made about how organizations may be collecting and using data. To my mind, one of the things that is not well thought out, or often addressed effectively, is secondary use. In other words, you collected the data for one purpose, and you told the consumer you were collecting it for purpose A, but then, because we have so much data now, people say, oh, well, can I use this for some other purpose? Is that a problem for these kinds of environments as well?

Speaker 3:

It absolutely is, Wes. How you can use data is really going to be determined in part by what regulatory structure you're under. Are you a HIPAA covered entity or a non-HIPAA covered entity? That's the first determining factor. What we've started to see, though, is that even companies not regulated under HIPAA are using data for those secondary purposes, or disclosing it for secondary purposes like advertising and social media, to those tech giants, in ways that aren't disclosed in their privacy policies. So even though they might not face an outright prohibition on those secondary uses of the data, they're still getting in trouble, because they're using data in ways in which they said they weren't going to use it in their privacy policies. And then, of course, secondary uses of data under HIPAA can be very concerning, especially if you don't have the right safeguards and protections in place, like a business associate agreement that limits how even your vendors downstream can use and disclose that data. We've seen a couple of companies get in trouble that way: they are HIPAA covered entities, but they have shared health data through tracking technologies with Facebook or Google or Snapchat or Instagram in ways that aren't allowed under the Privacy Rule. So it's a huge problem that we're seeing right now.

Speaker 2:

Yeah. And in some cases, they don't even realize they're sharing that data.

Speaker 3:

That's the scariest thing, right? I work with a lot of health tech startups, and one of the first things we do when we talk about their privacy policies is note every single way in which they are disclosing data. Every single way. I often have conversations about these tracking technologies and these cookies, and Wes, I can't tell you the number of times companies say to me, oh, I don't know what a cookie is, or I don't know what this tracking technology is, let me go back and ask my IT person and see what they've put into this app. And that, to me, is concerning, because this is your app, your company, and you don't know what data you're collecting.

Speaker 2:

Right. I find that even in my own work, people have become a little more comfortable with the idea of what a cookie is, but what's a web beacon, or what's a pixel? <laugh> We spend a lot of time educating our clients on what those things are and what they mean, because you may not even recognize that they're present in your environment. And even your IT department or function may not have a good handle on it either, if they haven't really given it consideration or haven't learned the parameters of these ways of moving data from one place to another.

Speaker 3:

That's absolutely right, Wes. I've had companies I've worked with even say, well, do we have to list out the cookies and the tracking technologies that we're using? Can't we just say we'll figure it out later, or say we're using tracking technology but not disclose which kinds? That really shows, on their part, a lack of understanding of how they're collecting and disclosing data, and also a lack of interest in learning more and protecting consumer data. And that I find very concerning, because if you as a company don't know what data you're collecting and sharing, how can your customers trust you? Especially when we're in a world in which mental health data, reproductive health data, all of that, is really at the forefront of these discussions.
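To make the "pixel" concept the speakers mention concrete: a tracking pixel is essentially an invisible image or script embedded in a page or app screen that fires an HTTP request to a third party, with identifiers and page context packed into the URL. The following is a minimal illustrative sketch, not taken from any particular vendor's SDK; the endpoint, account ID, event names, and parameter names are all hypothetical.

```python
# Minimal sketch of how a third-party tracking pixel discloses data.
# The endpoint and parameter names below are hypothetical, chosen only to
# illustrate the mechanism; real ad/analytics pixels differ in detail.
from urllib.parse import urlencode


def build_pixel_url(user_email_hash: str, page_path: str, event: str) -> str:
    """Build the URL an embedded 1x1 image or script would request.

    Everything in the query string is delivered to the third party the
    moment the page renders; there is no separate "send my data" step.
    """
    params = {
        "id": "ACCOUNT-12345",     # the app developer's (hypothetical) ad account
        "ev": event,               # e.g. "PageView", "CompleteAssessment"
        "uid": user_email_hash,    # pseudonymous, but often re-identifiable
        "url": page_path,          # the path alone can reveal health context
    }
    return "https://tracker.example.com/collect?" + urlencode(params)


if __name__ == "__main__":
    # A mental-health intake page firing a pixel: the page path
    # ("/depression-assessment/results") itself discloses sensitive context.
    print(build_pixel_url(
        user_email_hash="hashed-user-identifier",
        page_path="/depression-assessment/results",
        event="CompleteAssessment",
    ))
```

The point of the sketch is simply that whatever the developer (or a copied tag snippet) puts into that request is sent to the third party automatically, which is why an inventory of every embedded tag matters.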

Speaker 2:

It's funny, as we talk about this subject, I'm looking at the notice page for the application we're using right now, and I see privacy and legal policies, I see your privacy choices, and I see cookie preferences. If I haven't taken the time to actually go and click on those things, I don't really know how this particular application is or is not using my data, or what cookie preferences I might have set. And I think that's a common problem for many people: they don't take the time to go and read these things. One of the reasons is that they're often written in terms that are difficult for the average consumer to understand. <laugh>

Speaker 3:

And they're lengthy. <laugh>

Speaker 2:

Lengthy. Forty-seven pages to agree to whether I can play a game on my iPhone, whatever the case might be. <laugh> And that's just a game. We're not even talking about something that's collecting my data to use for my health.

Speaker 3:

Well, exactly. And there have been studies showing that if you read every privacy policy or terms of use you've ever agreed to, it would consume something like seven years of your life. No consumer is going to do that. Truly, even as lawyers, we're not going to spend hours and hours sifting through those privacy policies for things like a game or whatever might be lower risk. But I can tell you the average consumer clicks the I Accept button without a second thought, without even opening the document. Now, some companies have engaged in more consumer-friendly and consumer-protective behaviors, where they actually require you to open the document and scroll down through it before you're allowed to click I Accept. And while that is a better practice, the problem is consumers are quickly scrolling, right? It's still not...

Speaker 2:

Through. Yes.

Speaker 3:

Exactly. You're not enhancing the reading or the retention rates of that information, right?

Speaker 2:

So if I had to make a suggestion to the industry about something that would be important to do better in that regard, it would be: short, concise, and easy to understand. If we did that, we might have more people who actually read them. Unfortunately, that butts up against some of the legal requirements that we face.

Speaker 3:

Exactly. I've even worked with companies to prepare a one-page cover sheet, so whenever the consumer opens that privacy policy, it says, here's exactly what you need to know about how your data is being disclosed. Then you can go and read the rest of it, how we're collecting it, what cookies are, all of that, but at least you know upfront how that data will be disclosed downstream. I can tell you, though, a lot of companies are not doing that.

Speaker 2:

Yeah. To me, the best idea would be to do what OCR did with the notice of privacy practices some years ago: allow a layered notice. So here's your first page, the one page that hits the highest-level key elements, and click here if you want to know more. Some consumers will probably do that, others probably won't, but at least we give them more choice. So if I could advocate for a position, that would probably be the one I'd advocate for.

Speaker 3:

Yes, I agree. I think the ones we have today are simply too long, and consumers are not going to understand them at all.

Speaker 2:

Right. And you know, the funny thing is, we didn't start out to talk about notices and cookies, <laugh> but the reality is that ties in very tightly to what we're talking about, which is the data privacy and security risks in technology-enabled healthcare. So when we think about the different kinds of risks, let's start with telehealth communications. Where do you see the real privacy and security risks on the telehealth communications end of all of this?

Speaker 3:

Absolutely. When we're talking about telehealth, we're usually talking about companies that do have oversight under HIPAA. That doesn't apply across the board, but most telehealth companies are going to be covered entities, unless you've built a telehealth company that doesn't accept insurance and isn't filing any type of standard claims. So, to be clear, most of these telehealth companies fall under HIPAA, but there is still a subset, where insurance isn't involved, that can sit outside of HIPAA. When we think about the technologies being used for telehealth, there are stringent standards under HIPAA as to what you have to do with respect to the security of the platforms you're using for audio and visual communications. So I think we have a better privacy and security structure around telehealth communications than we do for apps and wearables, which I know we'll talk about shortly, Wes. But that doesn't mean it's foolproof. We still have the potential for cyber hacking, which has really come into play as we've moved into this digital health landscape. Did you know healthcare records themselves can be worth $250 or more on the black market, compared to around $5.40 for a credit card? That's a huge difference. So to the extent that hackers can get into the audio and visual software you're using with your healthcare providers, they can get your data. They can hack into your patient portal or your provider's system, and your data can be leaked out to the public, and even out to law enforcement agencies when we're thinking about reproductive healthcare. That's of utmost concern, and it was of utmost concern when the COVID pandemic hit and all of these providers were switching so quickly to new telehealth modalities that really hadn't been built with privacy and security for health data in mind. Now we've started to see a lot of those companies implement the BAAs and the other things they need to be able to process and collect that data and share it back with the covered entity. But in the early stages of COVID, that was all being developed on the fly; we were building the plane at the same time we were flying it. And so we still have some remnants of that when we think about the privacy and security protections involved with these types of telehealth visits.

Speaker 2:

That term, building the plane while flying it, is one I've heard so frequently in the last three years.

Speaker 3:

Yes.

Speaker 2:

And what it really denotes is we're trying to keep it afloat while we're trying to make it better. Early in the COVID pandemic, that was certainly a very realistic concern. We saw many of our clients suddenly shift to sending all of their employees home and then try to figure out what to do about security controls and VPNs and the different ways of accessing and connecting back into the main systems. So I can imagine that if that's what we saw at Clearwater in our universe of clients, you can expand that across the entire ecosystem very easily and find that pretty much everybody ran into the building-the-plane-while-flying-it scenario early on, just trying to manage the realities we were suddenly faced with. I, for one, am thankful that we're finally back to being able to get on an airplane again and go and see other people and meet in the same room. <laugh>

Speaker 3:

Yes. And even while all of this was going on, I would get so many questions from covered entities who were building these telehealth platforms: well, my nurse just texted my doctor, and it included patient health information, and it wasn't over a secured channel because it was a text message, because they were trying to coordinate a schedule. Or the emails. When that entire infrastructure is decentralized, because everybody is working at remote locations, implementing the privacy and security controls for it is very challenging. So during that time we saw a lot of inadvertent breaches. Maybe they only affected one or two people, but you have the unsanctioned text message that isn't sent through a secured channel, or the email that isn't properly secured. It was shocking to see that environment, where these are people who understand and know HIPAA, but you take it and decentralize it, and then you have all of these new elements and factors coming into play.

Speaker 2:

Yeah. Something you just said sparked in me the idea of the unsecured text message. Many people just don't understand the reality of SMS and how easy it is to hack or attack and do things within that particular environment. So I take that idea, that many people don't understand even that simple one, and then expand it across this universe of healthcare wearables and applications that are not regulated in any way, and it suddenly becomes a much bigger picture. When I think specifically about non-HIPAA-related apps and wearables, there are a lot of common misconceptions. What are some of the ones that come to your mind first?

Speaker 3:

Ooh, the number one is that a lot of people using these apps and wearables think their data is covered by HIPAA. It's an automatic assumption; it's not even a question in their mind that HIPAA might not apply. And why would it be? You're not involved in the day-to-day legal operations of healthcare. How would you as a lay consumer understand the difference between a covered entity and a business associate, and when HIPAA applies and when it doesn't? A lot of consumers think that because they're taking the same health data they give to their provider and putting it in an app, the exact same data, it should be protected the same way. And it's not. When I explain that to consumers, because I do a lot of consumer advocacy, it's so hard for them to wrap their minds around why their data isn't protected. Why does it matter who holds the data? It should be protected because it's health data. So that's the first big misconception. The second big misconception is that there are other limits, whatever they may be, on how these companies can use health data. Here's the thing: if you're a non-HIPAA covered entity with an app or a wearable, and, setting aside the state privacy laws that are coming in, let's say you're in a state without one of those laws, there's not much beyond what you tell consumers about how you're using their data that restricts your use of it. We've seen a lot of these FTC actions come through saying, you were unfair or deceptive, because you told consumers you weren't going to use their data this way and you were using it this way. If those companies had said in their privacy policies, yes, we're selling your data to Facebook and Instagram and Pinterest and TikTok, and that was fully disclosed, they wouldn't be in trouble. That's the thing a lot of consumers don't understand. It's not that companies can't sell your data that way; it's that they said they weren't going to sell it that way. The distinction is that they actually can, if they're being upfront and honest with you about what they're doing with your data. So those are two huge misconceptions I see in the app and wearable world. And then the third one, and I don't know if it's necessarily a misconception or just the way in which we're willing to disclose data, is that consumers don't necessarily look at or examine the privacy and security practices of the apps they're using. They don't really have that at the forefront when they're thinking, oh, cool, I can get this algorithmic prediction, or I finally have the power to track my own healthcare. A lot of times that outweighs the privacy and security risks associated with these applications in the minds of consumers, until something like a breach happens.

Speaker 2:

Right. You mentioned a couple of companies earlier that got into trouble for their lax privacy and security practices. Would you talk a little more about those specific cases, if you can?

Speaker 3:

Absolutely. Over the years we've seen the FTC's involvement with the privacy practices of health technology companies increase, and its scrutiny is increasing. There have been a couple of recent cases in a couple of different fields: we've seen mental health, and we've seen femtech, which, for those not familiar, is female health technology. Those have been some of the most prominent cases. For instance, BetterHelp was one. BetterHelp, for those who aren't familiar, offers online counseling services and is owned by Teladoc. It reached a settlement with the FTC, but the FTC claimed that BetterHelp was disclosing data like email addresses, IP addresses, and information collected from health questionnaires, and selling that data to companies like Facebook and Snapchat for advertising purposes, and that it did this over a period of years, from about 2017 to 2020. So the FTC investigated and came to a settlement with BetterHelp, and the settlement did things like ban BetterHelp from sharing consumers' health data with third parties for marketing or advertising purposes. It required BetterHelp to pay $7.8 million to consumers to settle the charges that it was using their data in ways it had promised not to. It also has to put into place a comprehensive privacy program, and it has to direct the third parties to whom it unlawfully disclosed this data to delete it. So yes, it got slapped with a significant fine of $7.8 million, but the cost of implementing a comprehensive privacy program, and the cost of notifying consumers and those third parties, those are huge costs too. A lot of times companies don't understand that. They think, okay, we might get slapped with a fine. Yes, you might, but now you also have to build a comprehensive privacy program, and that is very costly, versus if you had built it in from the beginning and had those safeguards in place to avoid the investigation, the negative press, and the limitations on all future uses of that data. So BetterHelp was one. The other one recently investigated in the mental health space was Cerebral, which you've probably heard of, because it has been in the news for quite some time on a couple of different levels: it has FTC investigations, DOJ investigations, DEA investigations. It has a lot going on. But thinking about just the privacy side, Cerebral was like the love child of digital health technology for mental health. It had around a $4 billion valuation and skyrocketed in 2022, one of those apps that people looked at and thought, this is a unicorn. Cerebral was providing comprehensive online mental health services for depression, anxiety, PTSD, ADHD, bipolar disorder, all of those types of conditions. What happened, though, is it ended up sharing private data on 3.1 million users for years with advertisers and social media platforms.

Think Facebook, Google, TikTok. There was actually a notice of breach posted on Cerebral's website saying this had been going on since 2019, and at the time it was disclosed and discovered, it was the second-largest health breach of 2023. Cerebral was doing the things we just talked about, Wes, with the tracking technologies: it was using pixels and tracking technologies like those made available by Google and Meta and TikTok and providing sensitive health data back to those companies. It was sharing things like names, phone numbers, email addresses, and mental health assessments. And it got worse: if you were a patient who had purchased a subscription plan, your data also included things like your appointment dates, your treatment, and your insurance information, all of it shared back through those tracking technologies. And it's not limited to mental health apps; this is something we're seeing across health technology right now. I will say mental health and femtech are the two areas that have gotten the most publicity for FTC investigations, because on the femtech side we had Flo and Premom doing very similar things, sharing data downstream with advertisers, and in Premom's case even with some Chinese companies. So it's a similar story: you're sharing data in ways you promised consumers you wouldn't, whether that's an intentional breach on the company's part or, as we were talking about earlier, Wes, an unintentional one because you didn't realize what data you were collecting. Either way, these things usually go on for a couple of years before they're caught, and by that time they've exposed millions of patients' data to other companies, oftentimes tech giants.

Speaker 2:

Right. Something I've heard you mention in the past is some new studies ranking healthcare applications on a theme of data hungriness. Tell me what that means.

Speaker 3:

Yeah. Before COVID, we really didn't have a way for consumers to look at the privacy and security protections of an app and compare it against another application. Even today we don't have a perfect way to do this, but some companies are starting to do that research, diving into the privacy and security practices of health tech apps and saying which ones have good privacy and security practices and which ones are collecting more data than they need, or are, quote unquote, data hungry. So we've started to actually get some of that data. Two of the companies at the forefront of this are Mozilla and Surfshark. Mozilla has done studies related to mental health applications. For instance, back in 2022 it reviewed about 32 mental health, meditation, and prayer applications, including common apps like Calm and Headspace, and of those 32 apps, it gave 22 a Privacy Not Included warning label, which basically meant Mozilla had privacy concerns about how those companies were using personal health data.

Speaker 2:

Oh, good.

Speaker 3:

And then Mozilla also released a report earlier this year on popular mental health apps, and it found that almost 60% fell short of the minimum standards Mozilla would expect these companies to meet. Sixty percent. <laugh> That's a lot. There have also been other studies. For instance, the Journal of the American Medical Association published a study that reviewed about 578 mental health apps and found that 44% of them were sharing the data they collected with third parties. And there was a very interesting study done by Duke University earlier this year that looked at data brokers, what data they have, and what data they're selling. It found that data brokers are collecting and selling mental health data. The researchers looked at 10 of the most, quote unquote, engaged brokers, some of the most active ones out there, and found they were selling highly sensitive mental health data about Americans, including data about ADHD, depression, anxiety, and bipolar disorder. They're selling that data, so they have it. Surfshark has also done some rankings; we've seen this in femtech, for instance, where it ranks the most popular period tracking apps on their data hungriness. And oftentimes it's the big-name apps like Clue or Flo or Glow that everybody is using that are actually collecting the most data, while something like Apple Cycle Tracking gives you the same predictions with less data. So there have been some studies like that coming out recently, ranking and analyzing the applications that are out there. I will say, though, that most of these companies are collecting and selling data to some extent.

Speaker 2:

Right. It's going to continue to be an issue until we really rethink the entire approach, and that's a very difficult lift at best.

Speaker 3:

You're absolutely right, Wes. And truly, since the Dobbs decision we've started to see a movement in femtech where consumers are getting more informed about how apps and wearables are using their data. That led to a mass deletion of period tracking and ovulation and fertility apps right after the Dobbs decision came out. We're seeing that start to flow into other areas now, like mental health, where consumers are getting smart on these issues and realizing that the same privacy issues we have in femtech are present throughout the entire health technology field. So we're starting to see that slowly trickle down. But really, until consumers understand how much value their data carries, and the fact that it's their data and once it's out there you can't get it back, and until we really shift that data ownership discussion, we're going to continue to see apps operating like they do now, collecting and selling data downstream for revenue. The data gets out, and we have these breaches, because we as consumers aren't demanding the changes yet.

Speaker 2:

So if I think about it as a triad: we have the applications and the companies that are gathering data for a purpose, sometimes using or selling it, sometimes not, and they're one leg of the three-legged stool, so to speak. The second leg is the consumer becoming knowledgeable, knowing first what their rights are, what their responsibilities are, and how the applications will be using their data. But then there's a third leg to this stool, and that's enforcement. So what do you see in the emerging enforcement trends for technology-enabled apps?

Speaker 3:

Great question, Wes. Enforcement is finally taking center stage. We've seen the FTC become much more involved in examining and regulating the applications coming to market from a privacy perspective, and the FTC has been clear that it is looking at health technology companies right now for their privacy and security practices. It has developed toolkits, best practices, and guides for what these companies should be implementing to try to stay clear of review and investigation, both from the FTC and under HIPAA. What we've started to see, as I mentioned before, is more regulatory investigations of companies that could be misusing consumer health data, and we've started to see applications of the Health Breach Notification Rule. That's the rule we talked about earlier that has been in existence for over a decade but was never used, and part of the reason it was never used is that we didn't previously have this environment of health tech companies creating applications outside of HIPAA. Now that we do, the rule has reemerged. It was first recently rediscussed in the Flo FTC investigation, when a couple of commissioners who had joined the opinion but dissented on a different part said, why aren't we using this rule? This rule was made for this exact circumstance, where data has been disclosed without a consumer's knowledge or authorization. If we're not going to use it for health tech companies like Flo, why do we even have it? That was in 2021. Since then we've had two actions under the Health Breach Notification Rule, one against GoodRx, and I believe Premom was the second. The FTC has also issued a notice of proposed rulemaking for the Health Breach Notification Rule to essentially cover all types of non-HIPAA-covered applications and health technology out there. So that's been a pretty big development. The other thing we've seen, on the HIPAA side, involves reproductive health data. HHS came out with a notice of proposed rulemaking on how reproductive health data should be safeguarded and how it can be disclosed downstream, and that's really with respect to HIPAA covered entities. That's interesting because it's a recognition that the Dobbs decision is having an impact on how consumers are willing to share their personal reproductive health data, even with their healthcare providers, not just with applications, but with the providers who are supposed to be giving them the medical insights and treatments they need. They've found that this has been chilling the conversations between patients and providers, and so they've started to make some changes to the HIPAA rules to make sure this data can be freely discussed between the parties without fear of disclosure downstream for prosecution purposes. So those are some of the new developments in this area. We're of course waiting for final rules on everything, for notice and comment periods to pass.

But more and more health tech companies are being investigated and reviewed by the FTC, because this is a huge priority for them. They've made that very clear, and this is something a lot of health tech companies, not to pick on them, are doing wrong. So if you are a health tech company, or you're an attorney for a health tech company, making sure that your clients understand exactly how they're using, collecting, and disclosing data, and making sure that is reflected in a living, breathing privacy policy, should be your number one goal.

Speaker 2:

So what you've just expressed there is a fourth leg to our stool, if you will, and that is best practices going forward. What are some of the other things you think about when you think about privacy and security best practices that really should be inculcated throughout our industry?

Speaker 3:

Yeah. The number one thing I always tell founders of health tech companies, no matter what stage they're at, even if they come with just an idea and are building a minimum viable product, is: have a data map. If you as a founder do not understand exactly where you're collecting data from, what data you're collecting, where you're storing it, and where you're disclosing it, then there's no way you're going to have an accurate privacy policy, or even adequate protections at each of those entry and exit points for the data. People often get scared when you mention data maps and say, who can I hire to build me a data map? <laugh> It's a collaborative effort between you as the founder, with your product design decisions, and the technology company or vendor who is actually building the tech for you, and it has to be an open dialogue so that you all understand exactly what data you're collecting, where you're storing it, and how you're disclosing it. It's also part of the discussion about revenue. A lot of companies think selling data downstream is an additional source of revenue, and so they incorporate it into their platform. Have that discussion with your co-founders or your marketing or business development team: yes, this is an option for a revenue stream, but how much consumer trust in our product will we lose by selling their data downstream? Especially in this environment, is that going to be something that negatively harms our reputation, particularly as consumers now have the surveys and tools to compare health tech applications on privacy and security? Is this something that could have a negative impact on us down the line? The other thing I recommend: we live in a world in which more data is seen as better, <laugh> like everything else, and that's not the case when it comes to data. Founders and tech companies should absolutely be adhering to minimum necessary data principles, because if you're collecting more data than you need, you risk exposing more data than you need, and that's not only bad for consumers, it's bad for you as a company, not only from a reputational perspective but also from a breach perspective, because you have now exposed far more data than you ever needed to hold. And I will tell you, especially in the reproductive healthcare environment, what we have seen is that consumers are not just blindly putting data into your applications. If they feel you're asking for data that is too sensitive, too personal, or not relevant to what you're providing in the app or the wearable, they're going to put in false data. There have been studies, I can't remember if it was the Washington Post or the Wall Street Journal, showing that 50% or more of consumers are putting false data into these applications because they don't want those applications to have that level of sensitive health data. So just because you're collecting this health data doesn't mean consumers are giving you accurate data.

And that can be problematic when we think about some of these companies partnering downstream with research institutions to make longer-term healthcare predictions and analyses from that data; now you have false data going into those long-term research studies. So: an accurate data map, the minimum necessary principle, and making sure you have a living, breathing privacy notice or privacy policy for consumers. Wes, I cannot tell you how many times companies come to me and say, I copied this privacy policy from my competitor because they're doing the same thing as me, so yes, I have a privacy policy, it's copied from them. Or, yes, I had a privacy policy, it was developed by an attorney five years ago and I haven't touched it since. Your product is a living, breathing product. Every design decision, every new feature you add could potentially change how you're using or collecting data, or the types of data you're collecting, and that has to be reflected in the privacy policy, or else here comes the FTC, because now you have a mismatch between what you're doing and what you're telling consumers you're doing.
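As an illustration of the data-map idea Bethany describes, here is a minimal sketch of what such an inventory might look like in code: each record captures what is collected, where it enters, where it is stored, and who receives it downstream, plus a simple check that flags anything shared with a third party the privacy policy does not disclose. The field names, data elements, and recipients are hypothetical examples, not a prescribed format or any company's actual map.

```python
# Minimal, hypothetical sketch of a data map (data inventory) for a health app.
# Field names and example entries are illustrative only.
from dataclasses import dataclass, field


@dataclass
class DataElement:
    name: str                 # what is collected
    source: str               # where it enters (intake form, wearable sync, ...)
    storage: str              # where it lives
    recipients: list[str] = field(default_factory=list)  # downstream disclosures
    purpose: str = ""         # why it is collected (minimum-necessary check)


DATA_MAP = [
    DataElement("email_address", "signup form", "prod database",
                recipients=["email provider", "analytics vendor"],
                purpose="account login and notifications"),
    DataElement("mood_assessment_score", "in-app questionnaire", "prod database",
                recipients=["ad platform pixel"],  # the kind of flow regulators have flagged
                purpose="personalized content"),
]

# Recipients actually named in the published privacy policy (hypothetical).
DISCLOSED_IN_POLICY = {"email provider", "analytics vendor"}


def undisclosed_sharing(data_map, disclosed):
    """Return (element, recipient) pairs the privacy policy never mentions."""
    return [(e.name, r) for e in data_map for r in e.recipients if r not in disclosed]


if __name__ == "__main__":
    for element, recipient in undisclosed_sharing(DATA_MAP, DISCLOSED_IN_POLICY):
        print(f"WARNING: '{element}' flows to '{recipient}' but is not in the privacy policy")
```

Keeping an inventory like this current as features change is the "living, breathing" discipline described above; the gap the check surfaces, data flowing somewhere the policy never mentions, is the kind of mismatch the FTC has treated as a deceptive practice.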

Speaker 2:

Yeah. I've seen that in the pure healthcare space for a long time. One of the first things I do when I start working with a new client in the healthcare world is go to their website, pull down their privacy notice, and read it. Even if I go to a new doctor, I ask for their notice, and most of the time they have to hunt it up for me. They say, well, sign here for the privacy notice, but then they don't have it handy, <laugh> because they just don't understand what this notice is for: you're informing me of what you will do, and that gives me, the consumer, the choice of whether I want to do business with you or not. So the privacy notice cannot be boilerplate. It cannot be something you just pull down from some other place, because I'll tell you, every time I've seen that happen, I read the notice and I say, that's not what you do.

Speaker 3:

Yes.

Speaker 2:

You're providing this service, but your notice is this big document that talks about 43 other things you don't even do. It's a worthless notice. And that's the point. Granted, we're in this industry where we're living and breathing healthcare, privacy, security, and notices on an everyday basis, but I've noticed that consumers are getting smarter about taking the time to actually read these notices and to think about what they mean, and to think about what's being collected, even in healthcare. I had someone come to me a few days ago concerned about the social determinants of health questions that were coming up in a patient intake. The patient was very concerned about some of the types of questions being asked. And how do you answer that? It goes back to what we said: I could lie, or, not lie, but perhaps be a little less truthful about what I put in there, or I can refuse and walk away, or I can give them my data and hope it's not going to be used for a purpose I didn't understand or that wasn't described to me. So all of this comes together to say that it really is more than a three-legged stool we have to think about here. We have to think about the responsibility of the application developers and deployers, the people who are creating these applications and wearables and those who are marketing them. We have to help consumers be smarter about what is happening with their data. And we have to do a better job with our practices around what we do with data once we collect it. Those things are critical. I think we've hit on a lot of good things here this morning, and I appreciate so very much you taking the time to spend with me talking about this subject. How would you wrap all this up? Is there anything more you would want to add to the picture?

Speaker 3:

You know, I would also say, Wes, that when COVID hit and we had this huge shift into digital health, that was one of the first times consumers really had the power to monitor their health at home through these new applications and wearables that were coming out. And I think part of it, from the consumer perspective, was: we'll willingly trade our data to finally get these insights about our health that we never really had before. As that trend has progressed, consumers to a certain extent have become desensitized to the privacy and security of these apps and wearables, because they think, I'm getting something valuable in return, so I'm willing to risk my data with this type of application, because I want the ability to check my blood pressure or whatever vital sign I'm tracking on that healthcare application. For a long time, consumers willingly made that trade-off, with the knowledge in the back of their minds that yes, there could be a breach or a hack, or their data could get out. Now that we have so many different applications and wearables on the market, consumers, as you mentioned, are starting to shift and become smarter about it, and starting to say, great, I now have 10 of these apps to choose from, so I'm going to look at privacy and security, because that is something that's important to me. And Wes, I will say, for the first time, especially after the Dobbs decision for reproductive health, I have started to see privacy and security becoming a market differentiator in the health technology space in ways that it never was before, because we have more options, and we have companies like Mozilla and Surfshark doing the analysis and comparing these applications for consumers. So I think companies that now start to build privacy and security by design into their products will eventually be able to capture some of the market share that, before, you weren't really able to capture on privacy and security alone.

Speaker 2:

Yeah. As we wrap this up, I'm always reminded of one maxim I think about when people ask me about the privacy and security of an application or the use of their data: if you can't identify what's happening with your data because you were given the application at no cost, you are, in fact, the product.

Speaker 3:

A hundred percent.

Speaker 2:

It's always worth remembering. Bethany, thank you so much. This has been a really enjoyable time this morning. On behalf of Bethany, myself, and Clearwater, thank you all so much for listening today, and I hope you have a great day. Bye now.

Speaker 1:

Thank you for listening. If you enjoyed this episode, be sure to subscribe to AHLA's Speaking of Health Law wherever you get your podcasts. To learn more about AHLA and the educational resources available to the health law community, visit americanhealthlaw.org.