Cracking the Cancer Code

You can follow the rules but still be unethical

ITCR Training Network Season 1 Episode 9

In this episode of Cracking the Cancer Code, we discuss the critical role of ethics in cancer research data management. Dr. Jody Platt discusses with us how researchers navigate patient consent, data sharing, and privacy protection in an era of rapidly evolving technology. The episode highlights the importance of transparent communication, cultural competency, and the emerging concept of data sovereignty in building ethical research practices.

Note: There are laws to prevent discrimination based on genetic risk by employers and insurance companies; however, advances in technology have created emerging concerns. See this article https://www.annualreviews.org/content/journals/10.1146/annurev-genom-111119-011436 for more information on the nuances involved.

0:07: You're listening to Cracking the Cancer Code, a podcast series about the researchers who use data to fight cancer. 
 0:14: I'm Doctor Carrie Wright, a senior staff scientist at the Fred Hutchinson Cancer Center, and I lead content development for the ITCR Training Network, a collaborative effort of researchers around the United States, funded by the National Cancer Institute, aimed at supporting cancer informatics 
 0:29: and data science training. 
 0:31: And I'm Candace Savonen. 
 0:32: I'm a data scientist at the Fred Hutchinson Cancer Center and the tech lead of the ITCR Training Network. 
 0:38: We work closely with a variety of dedicated cancer researchers on the forefront of cancer informatics shaping the field's future. 
 0:45: In our last episode, we re-centered ourselves on the lived experiences of cancer survivors, trying to remember how to not get lost in the nitty gritty of research and the numbers and remember what our real goal is, which is to help people survive and thrive, hopefully after cancer. 
 1:04: But it can be surprisingly easy in all the hubbub of science to forget about the lived experiences of those we're actually trying to help. 
 1:15: When you're working with de-identified data and you're just, like, sitting at, you know, a computer and you're programming stuff, it's really easy to forget that there are people behind those numbers. 
 1:25: So it's not like abject super negligent behavior to sort of forget about the people behind the work that you're doing. 
 1:32: And researchers are super busy and they're doing all kinds of stuff. 
 1:37: Some of it, you know, doesn't even feel related to your research. 
 1:40: So it's, again, this isn't a blaming kind of activity. 
 1:43: It's like, it's hard to do this, but it's also certainly plausible to try to find ways to talk to patients. 
 1:51: That was Dr. Jody Platt. 
 1:52: My name is Jody Platt. 
 1:54: I'm an assistant professor in the Department of Learning Health Sciences at the University of Michigan Medical School. 
 2:00: My background is in public health, and I got my PhD from the School of Public Health at the University of Michigan in Health Services Organization and Policy, which is basically like health services research. 
 2:12: We talked to her to learn more about the role of ethics in research. 
 2:16: The elevator version of how I describe my work is that it lies at the intersection of ethics and informatics and policies. 
 2:24: So ethics is usually like what should we be doing, policy being like what do we do? 
 2:29: And then informatics is the subject area. 
 2:33: The way I got involved in this was after I got my master's in public health, I started to work with a public health geneticist who had a project with our state health department in the state of Michigan where they wanted to use newborn screening blood spots and the data that goes along with that. 
 2:49: So that means like information about where babies are born, where their mother is, basic demographic information that can be connected to other health information. 
 2:58: Initially, it's really just used for newborn screening, and it's a long-standing public health program. 
 3:03: The state of Michigan and several other states 20 years ago now, sort of decided this is a really rich and amazing repository for data. 
 3:13: And so that's sort of, for me, like my case zero in big data and ethics and shifting norms and shifting policy, and so they wanted to know how they could use these blood spots and the related data for research when that wasn't the initial intention of data collection, and that's how I got started. 
 3:33: In this episode, we will discuss patient data from the patient's perspective and hopefully answer some common questions patients may have about their data. 
 3:40: So Carrie, can you tell us a bit about why is ethics so important and what is it actually? 
 3:46: That's a great question, Candace. 
 3:48: I think it's something that's overlooked a lot, and actually the biomedical field does a pretty good job, and we'll talk about that with Jody, but, you know, there's been some historical, really bad things that have happened in research, and that's caused a lot of mistrust and damage. 
 4:03: In terms of the relationships that we have with communities and to really help people, we need people to trust us and we need to be doing the right thing. 
 4:14: Obviously, there's people out there that are advocating for patients' rights and to make sure that research is happening ethically. 
 4:21: But it could also be helpful for any of us as individuals to know about how these things work and how consent should work, how research should work, and so, Carrie, it would be great if we could kind of dive in 
 4:33: to how consent in research should happen. 
 4:37: Yeah, absolutely. 
 4:39: So, consent is a bit more complicated than I think people often realize. 
 4:44: And so the idea for this concept is that when people participate in research, they understand the potential risks involved, and they understand possible risks that maybe are less likely but possible in the future, and I feel like that aspect is much more important 
 5:03: today, because now, the way in which our data gets analyzed, we're using machine learning, AI, you know, all of these really fancy algorithms that can actually re-identify, meaning link the person's identity back to their data, in a way that we never could before. 
 5:23: And so we have to explain to patients when they're in a trial, when we collect their data, that there is a potential for someone to identify them as being in this trial in the future. 
 5:37: Depending on what kind of data we're getting from someone, that is more or less likely. 
 5:42: So certainly certain forms of genetic data are really like a personal signature, a unique signature to that individual. 
 5:49: And if they're in multiple studies, you know, you could potentially identify that person as being a participant in various studies, and that can be problematic actually. 
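
Note: To make this re-identification risk concrete, here is a minimal, hypothetical Python sketch of a classic linkage attack, where a "de-identified" research dataset is joined to an identified public dataset on quasi-identifiers (zip code, birth year, sex) that survive in both. All names, records, and column choices here are invented for illustration.

```python
# Hypothetical illustration of re-identification by linkage.
# All records are invented; no real data is used.

# A "de-identified" research dataset: names removed, but
# quasi-identifiers (zip code, birth year, sex) remain.
study_records = [
    {"zip": "98109", "birth_year": 1975, "sex": "F", "variant": "BRCA1 pathogenic"},
    {"zip": "48104", "birth_year": 1982, "sex": "M", "variant": "none detected"},
]

# A separate, identified dataset (for example, a public record)
# that happens to share the same quasi-identifiers.
public_records = [
    {"name": "Jane Doe", "zip": "98109", "birth_year": 1975, "sex": "F"},
    {"name": "John Roe", "zip": "48104", "birth_year": 1982, "sex": "M"},
]

def reidentify(study, public):
    """Link the two datasets by matching on quasi-identifiers."""
    key = lambda r: (r["zip"], r["birth_year"], r["sex"])
    named = {key(p): p["name"] for p in public}
    return [
        {"name": named[key(s)], "variant": s["variant"]}
        for s in study
        if key(s) in named
    ]

print(reidentify(study_records, public_records))
# -> [{'name': 'Jane Doe', 'variant': 'BRCA1 pathogenic'}, ...]
```
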
 5:59: And the reason that's problematic is because if it's identified by an employer that someone has a particular genetic variant which makes them especially prone to cancer, 
 6:11: they're going to potentially have multiple types of cancer, or they might have a high risk for a neurodegenerative disorder or something like this. 
 6:18: Someone could discriminate against them and not choose to employ them or to continue to employ them because they're worried about their ability to do their job. 
 6:28: Secondly, there's issues with insurance and the way that healthcare works. 
 6:32: In the United States, we need insurance to help us pay for our medical bills. 
 6:36: Insurance can charge higher premiums potentially if they know that you are at high risk for various illnesses. 
 6:43: That's currently the case, and that's a problem. 
 6:46: Carrie, can you tell us a bit about how this relates to data sharing? 
 6:50: Because that was something we talked a lot about in previous episodes, how data sharing is the right thing, but what we're just talking about is that it is tricky: if that data is shared in the wrong way or in the wrong hands, it could be applied in a way that's harmful to somebody. 
 7:03: So how do we balance these needs of being ethical and sharing data when it needs to be shared, but like not sharing it when it shouldn't be shared? 
 7:12: Absolutely. 
 7:12: So it is indeed way more challenging to figure out how to deal with consent nowadays, because typically, data is used more than once from when it is collected, and so that causes obviously some more issues in terms of trying to get people to consent to their data 
 7:32: collection, because you need to explain, well, the data may be used for these other purposes, or, you know, how should we set this up so that your data is used in the way that you're comfortable with. 
 7:44: And so, there are several different ways that people have tried to deal with this issue. 
 7:51: I'm going to describe a few methods that have been useful. 
 7:54: So, one is called blanket consent, which is where a subject agrees that their data can be used for any purpose 
 8:01: that the person who holds the data agrees is reasonable. 
 8:05: Obviously, this is the most permissive, and not everyone would be comfortable with this. 
 8:09: However, there are benefits to this because the participants don't have to really do anything and they can have their data benefit society as much as possible, ideally, if it's being used well, of course. 
 8:21: Next is broad consent, which is where subjects agree that their data can be used for a set of specified purposes, and so this is more protective and restrictive of 
 8:32: the data. So for example, you can use my data again, but you can only use it for diabetes research. 
 8:37: You can't use it for any other purpose. 
 8:39: And then there's dynamic consent, which is where subjects are asked case by case, someone calls them up and says, hey, we have this new study, we'd like to use your data for this other purpose, are we allowed to use it? 
 8:52: You can consent, yes or no, you can decide you don't want to participate or not. 
 8:57: And then finally, there's meta consent, which is where a subject would have the option 
 9:01: of these other three that I described, which is really cool, but obviously more complicated. 
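
Note: As a way of summarizing the consent taxonomy just described, here is a minimal, hypothetical Python sketch. The class, function, and purpose names are invented for illustration; this is not a real consent standard or library. The point is that each model answers "may my data be used for purpose X?" differently: blanket consent says yes to anything the data holder deems reasonable, broad consent checks a pre-approved list, dynamic consent asks the participant case by case, and meta consent lets the participant choose which of those regimes governs their data.

```python
# Hypothetical sketch of the consent models described above.
from dataclasses import dataclass, field

@dataclass
class Consent:
    model: str                 # "blanket", "broad", or "dynamic"
    approved_purposes: set = field(default_factory=set)  # used by "broad"

def may_use(consent, purpose, ask_participant=None):
    """Decide whether data may be used for a purpose under each model."""
    if consent.model == "blanket":
        # Any purpose the data holder agrees is reasonable.
        return True
    if consent.model == "broad":
        # Only the purposes specified up front.
        return purpose in consent.approved_purposes
    if consent.model == "dynamic":
        # Contact the participant case by case.
        return ask_participant(purpose) if ask_participant else False
    raise ValueError(f"unknown consent model: {consent.model}")

# Broad consent: data may be reused, but only for diabetes research.
c = Consent(model="broad", approved_purposes={"diabetes research"})
print(may_use(c, "diabetes research"))   # True
print(may_use(c, "schizophrenia risk"))  # False

# Meta consent sits one level up: the participant first picks which
# of the three models above will govern their data.
```
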
 9:07: I'm just thinking of myself, if I were a participant in research and I was given these options, right? 
 9:13: Like, let's say I was given meta consent, so it meant I could choose between blanket consent, broad consent, or dynamic consent. 
 9:20: I think it would really come down to how much I trusted the institution that was doing the research, or maybe even, I guess, how passionate I was about the cause. 
 9:30: It almost comes 
 9:31: down to, like, a little bit of a relationship between the individual participating as well as the people carrying out the research. 
 9:38: Absolutely, yes. 
 9:39: It comes down to relationships and trust, for sure. 
 9:42: As a quick example, I don't know, this is probably like 12 years ago or something at this point. 
 9:47: We'll say 10 years ago. 
 9:48: I filled out one of those 23andMe kits because at the time, I felt pretty good about it, right? 
 9:53: And they made it very clear about how you could do all this and how they would get your consent. 
 9:56: But obviously, if you've heard anything about 23andMe or read about it, some things 
 10:00: have changed over time about their relationship to their customers, 
 10:05: me being one of them, in terms of how they've collected that data and what they can do with that data, and it's been a little bit of a moving target, right? 
 10:12: Because as technology changes, some of these ethics things also changed. 
 10:16: So all that to say, I felt comfortable 10 years ago and nowadays I do not feel comfortable so I actually revoked my data from 23andMe. 
 10:25: And again, this is all a personal decision. I'm just saying that you should look into it for yourself and read about things, and now that you know some of this information that Carrie's told you about with consent, it might help you make some of those decisions for yourself. 
 10:37: Yeah, I think that's a really good point. 
 10:38: This is a personal decision, you know, it's up to you to decide how you would like your data to be used and that's totally valid if you don't want it to be used. 
 10:48: You know, we've talked about the benefits of data sharing and how that can help advance research, and, you know, as researchers, we're generally for research, but there are instances where, if you don't feel comfortable with the research team, it's totally reasonable to say, you know, something feels off here, I don't want to share my data. 
 11:09: The researchers should be comfortable and confident in talking to you about this. 
 11:14: They should have an established plan for what they're going to do with the data. 
 11:18: Currently, for NIH grants, we have to propose that we're going to share data and we have to say how we're going to do it when we apply for funding. 
 11:27: So they should have a plan in place, and they should be able to share those details with you. 
 11:32: There's really no secret about how that's going to be done. 
 11:35: So they should be able to share the extra measures of security and privacy practices that they're trying to use to protect your data. 
 11:43: So definitely feel comfortable in asking those questions. 
 11:48: It's very appropriate to ask those. If the research team seems like they're not quite sure what's happening with that, that might be an instance where you might think, I don't know, this looks a little bit 
 12:00: precarious to me. 
 12:01: Jody goes into further detail about how bioethics as a field actually formed as a result of some of these atrocities that we're talking about in this episode. 
 12:11: Bioethics formed because it was so bad. 
 12:14: That's really where it was rooted. 
 12:15: People were being essentially experimented on without their permission in a number of contexts, and we were experimenting on people that are already vulnerable, so poor people, people in prisons, 
 12:28: women, Black people, like, I mean, it was just like, you know, anybody who's currently stigmatized and has been dehumanized along the way became an immediate research subject. 
 12:37: I think the other place where people sometimes get tied up with ethics is that people tend to think of it as mostly about compliance and, like, what are the rules we need to follow and how do we work with our IRB, or get around the IRB, or whatever it is, but it's about the IRB and that's seen as a hassle. 
 12:54: One thing that I will tell people sometimes with respect to ethics is that we need it because we need to be able to go faster and further. 
 13:02: So, if you think about ethics, it's like building seatbelts in cars, we, because we have seatbelts in cars, we can build faster cars and not die every time, you know, something goes a little bit wrong. 
 13:13: Diving in then, what are some of the specific events that have happened in the past? 
 13:18: A research group at Arizona State University had collected data from individuals from the Havasupai 
 13:24: Tribe to study risk for type 2 diabetes, and this was in 1989. 
 13:31: So, these samples were initially collected specifically to study type 2 diabetes. 
 13:37: That was all that was discussed with the participants, but later on, the same samples were then used for other genetic studies and for other uses. 
 13:46: So these were involving schizophrenia risk, ethnic migration, and population 
 13:52: inbreeding, and all of these particular topics were not things that the tribe wanted research to be done on, because they felt like these were all stigmatizing, and they did not consent to their data being used for this type of research. 
 14:09: And so they filed a lawsuit in 2004 against the researchers and against the ASU Board of Regents. 
 14:16: I feel like this story not only shows the importance of communication, but also, as you said, of cultural differences. 
 14:23: I actually have some friends of mine who study medical anthropology and do all kinds of research in that realm of like how culture and stuff relates to medicine. 
 14:33: It feels like in that instance, they probably needed somebody like that on their staff. 
 14:38: Absolutely. 
 14:39: Realizing that these usages would be stigmatizing would have been really valuable for the researchers to know, but they also should have been more careful in trying to explain what the potential uses might be for their data. 
 14:55: So, they actually did have consent documents that asked for the data to be used for behavioral and medical problems, but you know, language was a challenge for their participants, so it may not have been described very well that this would be used for other purposes besides diabetes. 
 15:14: Ultimately, what happened though was that the tribe banned all ASU researchers from their reservation and decided that they didn't want to do any research with them, which is very understandable and a really sad outcome, because it means that other research opportunities to help this population are not possible because the ASU people just, you know, didn't do this as carefully as they should have. 
 15:41: And certainly that's not the first time that we've seen trust broken 
 15:45: between researchers and a certain community, and then had that kind of set things back, as far as the good that could actually be done through research, because of some poorly conducted and harmfully done research. 
 15:59: So this could have been done in a much more careful way. With more communication, everyone might have felt much better about this, and then the research for schizophrenia and migration and all of that wouldn't have happened, and yet future additional diabetes research could have happened, right? 
 16:17: Or maybe it could have been, if things were communicated properly. 
 16:21: That's true. 
 16:22: I think feeling blindsided never makes people feel comfortable. 
 16:25: Jody talked to us about how the source of many ethical blunders is actually miscommunications and misunderstandings. 
 16:32: I mean, I think some of the blunders in informatics are also sort of related to this disconnect between people kind of carrying on their normal business and then what society might expect of them. 
 16:43: Nobody's setting out to like be unethical. 
 16:45: They're not like, oh, I don't want to do this. 
 16:48: Another place where healthcare can get a little tricky is that we can follow the rules and still be unethical, and that can be a place where people don't expect to get in trouble, but they do. 
 16:57: So there have been blunders where researchers at Columbia University had sort of an institutional conflict of interest where they're spinning off companies and sort of making money from publicly funded research essentially. 
 17:09: And they didn't declare it; they didn't necessarily think about the conflict of interest ahead of time, so they didn't say anything about it, and then they got in trouble. 
 17:18: There have also been examples where there's data sharing that may be happening with commercial companies like Google, and I think it was the University of Chicago, where they were looking at X-rays, doing image research, trying to create databases and algorithms around that, and essentially sharing what people perceived as personal information with private companies without permission. 
 17:38: That's a case where they ended up settling out of court. 
 17:41: It's unclear whether anything illegal actually happened, but again, people didn't expect that sort of relationship to be appropriate, and so there was a misunderstanding about what people think is ethical versus where the enterprise was moving. 
 17:54: There are a lot of blunders, but a lot of them come down to this, like, mismatch of expectations. 
 17:59: So, what's interesting about this particular example with ASU is they had gone through the IRB, which is the institutional review board; basically, each research institute has one, and they're trying to make sure that research is done appropriately and ethically. 
 18:18: And so, they had gone to the IRB and they had gotten approval, because the consent documents that they had actually basically said that they could study, in quotes, behavior and medical problems. 
 18:35: So, according to the written document, they were following the rules and doing the right thing, but in spoken word, it's unclear how well they described the consent documents to the original participants. 
 18:51: And so, you know, this is a good example of people trying to do the right thing, but maybe not being aware 
 18:59: of the concerns of that particular population and not being perfectly clear about how to actually get the appropriate consent, you know, long term. 
 19:11: And again, like, you know, we haven't been sharing and reusing data in the history of biomedical research for that long. 
 19:18: So this is another case of, like, you know, we're trying to figure out how to do the next thing. 
 19:24: And learning how to share data is a little more complicated in terms of consent. 
 19:30: Yeah, and if you remember when we talked to Anna Green, she was talking about how funding institutions do try to put up requirements and guidelines that are attached to the funding. 
 19:40: So there are things, whether from funding institutions or laws or legal things in place to try to encourage folks to act ethically in their research and to do the right thing, but it just shows that there's two things in my mind right now. 
 19:53: One is that sometimes, like you're saying, people just don't know how to do it right, and the laws aren't enough to, like, 
 20:00: keep you within the bounds. 
 20:01: Think of it like when you're bowling and they have the gutter. 
 20:04: What do they call it? 
 20:04: The bumpers. 
 20:05: You can still get zero pins with the bumpers up, right? 
 20:09: It's the same thing with laws. 
 20:10: They will help you maybe not get out of bound completely, but you could still accidentally commit a really horrible ethical blunder anyway. 
 20:18: But then the other part of it is that, yeah, then there are unfortunately some people who are a bit malicious who kind of try to weasel through the laws in a bad way. 
 20:27: But like you said, it's mostly, actually, the first scenario, in which there's people who just don't know, and the laws are constantly trying to catch up with where technology is moving, and that's the other part of this equation, and we talked a lot to Jody about that. 
 20:43: So I don't know if you wanted to talk about that, Carrie, about just how the ethics and legal implications have to keep up also with the technology that is ever changing. 
 20:51: Yeah, I think another really critical and important piece there too is that it has to keep up with the culture 
 20:59: around technology, which is also dynamic. 
 21:02: Our thoughts about AI have dramatically changed in the last few years, and it's really important to keep pulse on the public and what the public thinks and more importantly, the population that you're studying. 
 21:14: So in this particular example with ASU, I think something that was really missing that could have helped is a cultural competency expert. 
 21:22: So somebody who understood more about that culture that could have helped mitigate some of the issues that could have happened here. 
 21:30: So I think it really comes down to me thinking that like we need more people who are sort of ahead of the curve, like sitting there trying to think what could happen, how should we maybe prevent these issues from happening instead of leaving it all to the researchers who are trying to do a lot at once, you know, trying to get all the grants, trying to do the research, trying to analyze the data. 
 21:52: It's a challenge. 
 21:53: So we need a lot more attention, I think, from people like Jody and others who are really thoughtful and considerate of what issues are emerging and how we can best deal with them now. 
 22:08: So, I guess I have a question for you, Carrie, and hopefully I'm not putting you on the spot too much. 
 22:13: You told us about a number of ethical blunders that have happened in the past. 
 22:17: And we've also talked about how difficult it is, as technology is changing to keep up with that. 
 22:22: So it feels like inevitably there will be some things that we get wrong. 
 22:27: I guess the second question is, how do we rectify those situations when those blunders do happen? 
 22:33: And do you have any particular examples of what people have done to kind of try to have some kind of recourse for those situations? 
 22:41: Yeah, that's such an interesting question. 
 22:44: I think this goes back to, I don't know, some of the training I've had for working with the community, which is: if you're doing anything that's of value, really, that is going to have impact, 
 22:55: then there's always a risk that you're going to do something wrong. 
 22:59: And so one of the major things that really helps is actually acknowledging when you make a mistake. 
 23:05: There are some really great examples of researchers who have retracted their work. 
 23:10: They've said, I made a mistake. 
 23:12: I've been studying this for 20 years in this way with this method, and I've realized that this method has a problem. And I really respect people like that, who can sort of separate themselves from the 
 23:24: research, and for them it's really about doing the right work and getting the right information out of our research. 
 23:32: So I think the first step is acknowledgement and owning up to it. 
 23:36: And then the second step is trying to come up with better practices that especially involve the research participants. 
 23:45: I think the more that we talk to advocates, talk to the research participants themselves, try to include them in the process 
 23:53: and in determining how the process should be. 
 23:56: This is called data sovereignty. 
 23:57: It's a concept where people who are participating in the research have more say or agency over how the data is collected, where the data is stored, that sort of thing. 
 24:07: And of course like that has to be a collaborative dialogue, right, because they might not know everything about servers and clouds and data security, right? 
 24:15: But there should be a lot of communication, and of course we can't do too, too much, right, because we don't want to overburden everybody, but I think moving in that direction of data sovereignty would be really powerful. 
 24:27: I'm new to this term of data sovereignty. 
 24:29: Can you succinctly kind of remind us what that means exactly? 
 24:33: Yeah, so it's actually a really complicated term and has many definitions, so it's a challenging question to answer. 
 24:39: In some cases, it means that the data is subject to the laws of wherever it was generated or collected, but it can also mean more of like the concept that I was sort of describing here. 
 24:51: Of the people who are participating in the research, who you're getting data from, having agency and say over how the data is collected and used. 
 25:02: So one of the illuminating things from talking to Jody is that she believes that ethics aren't something that's constraining us, but instead, something that helps us progress forward in a way that's really impactful to the people we're trying to help. 
 25:18: So I think what we're gonna need to find, and sort of try to, again, these are kind of research questions, but an opportunity for innovating: how do you respect people's autonomy? 
 25:28: What kind of choices are you really giving? 
 25:29: How do you ensure that you're maximizing benefits and minimizing harms and being transparent about those? 
 25:36: I think we're gonna need to find ways of sort of ongoing engagement because you can't exercise any of those rights or preferences if you don't even know what's happening. 
 25:45: So I see things moving more towards, like, a transparency 
 25:50: model, maybe a model of consumer protection. 
 25:52: So when things go wrong, there's recourse that people have because you're right, like consent is broken at the moment for this context. 
 26:00: I mean, another sort of hot word right now is governance, right? 
 26:04: That so people are really thinking a lot about what's the process of accessing data and what kind of credentials do people have in order to use data and what happens if there is a data breach, what are the consequences or the steps that are taken afterwards, so. 
 26:19: I mean, I think that's sort of where and how we're getting there. 
 26:22: Ideally, you know, a health system or a research institute or a university is looking to not only sort of their experts in compliance and research and informatics, but that they're also hopefully doing some outreach and engagement with patients to understand expectations and what they think is appropriate. 
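
Note: As a rough illustration of the governance ideas Jody describes, credentialed access to data plus an audit trail that supports recourse when something goes wrong, here is a small, hypothetical Python sketch. The dataset name, credential types, and policy rules are all invented; real controlled-access systems are considerably more involved.

```python
# Hypothetical sketch of credentialed data access with an audit trail.
import datetime

# Invented policy: what it takes to access this (fictional) dataset.
ACCESS_POLICY = {
    "tumor_genomics_v2": {
        "required_credentials": {"irb_approval", "data_use_agreement"},
        "allowed_purposes": {"cancer research"},
    },
}

audit_log = []  # every access decision is recorded for later review

def request_access(user, dataset, purpose):
    """Grant access only if credentials and purpose satisfy the policy."""
    policy = ACCESS_POLICY[dataset]
    missing = policy["required_credentials"] - user["credentials"]
    granted = not missing and purpose in policy["allowed_purposes"]
    audit_log.append({
        "when": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "who": user["name"],
        "dataset": dataset,
        "purpose": purpose,
        "granted": granted,
        "missing_credentials": sorted(missing),
    })
    return granted

researcher = {
    "name": "A. Researcher",
    "credentials": {"irb_approval", "data_use_agreement"},
}
print(request_access(researcher, "tumor_genomics_v2", "cancer research"))  # True
print(request_access(researcher, "tumor_genomics_v2", "marketing"))        # False
```
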
 26:42: We've talked pretty extensively about data sharing in our previous episodes, and now some of this 
 26:48: data isn't necessarily collected for a particular purpose. 
 26:53: Jody talks to us about how that changes the ethics process, due to the fact that we sort of no longer have purpose-driven data collection; or maybe it's initially purpose-driven, but then it becomes, we can use it for a lot of other things, and so we may as well keep it. 
 27:08: So yeah, data can last a very long time, and we don't have clear policies for this; it's sort of like, we're just gonna collect all the data and then figure it out later. 
 27:17: Which, again, becomes problematic when you're trying to be ethical and trying to respect people's autonomy and right to withdraw from a research study if they don't want to participate, which is foundational to informed consent, and informed consent is foundational to research, right? 
 27:31: You tell people what the risks and benefits are going to be, you enroll them in a study, the study ends and you destroy the data or you hold on to it for a certain amount of time and then destroy it. 
 27:41: But anyway, there was a time when that process was much clearer than it is now. 
 27:54: The one thing that feels slightly missing from this episode is that, unfortunately, a lot of the unethical things that have happened, I mean, we talked about this a little bit with the vulnerable populations, but the thing that's still happening in such a bad way is health disparities, and that's unethical. 
 28:13: So in our last episode, we talked to patient advocates, 
 28:17: who echoed a lot of what we have been talking about, which is, you know, wherever possible, it's important to talk to research participants and incorporate their needs and lived experiences. 
 28:27: We've had some very powerful conversations with folks about health disparity problems within cancer and it'll be very, I think, powerful and moving for folks to hear this next episode. 
 28:36: In our next episode, we're going to talk about something that we didn't really cover in this episode, but it is so critical, which is a piece of research that's really actually quite unethical at this point, which is that a lot of the advances that we've made are not equally shared among the people that could benefit from those advances. 
 28:55: And so in our next episode we're going to talk about health disparities and how having good treatments is only one part of the battle of cancer. 
 29:03: Thank you for listening to Cracking the Cancer Code. 
 29:06: This podcast is sponsored by the National Cancer Institute through the Informatics Technology for Cancer Research Program, grant number UE5CA254170. 
 29:16: The views expressed in this podcast do not reflect those of our funders or employers. 
 29:21: We'd like to thank Doctor Jody Platt 
 29:23: for talking with us about research ethics. 
 29:25: If you are interested in more content about biomedical ethics or research ethics, please check our website at ITCRtraining.org. 
 29:34: You can find lots of content about ethical data handling for cancer research, as well as other related practices like reproducibility.