Privacy is the New Celebrity

Michal Kosinski on the Consensuality of Data and the Responsibility of Big Tech - Ep 3

July 28, 2021 MobileCoin Season 1 Episode 3

In episode 3, MobileCoin Director of Marketing Lucy Kind interviews Michal Kosinski, a professor of psychology researching artificial intelligence and the risks it poses to privacy. Michal tells us about what action he took against Cambridge Analytica when it became synonymous with the destruction of privacy. He also explains how facial recognition software can detect sexual and political orientation, threatening the consensuality of personal data and our ability to control our identities in a digital age. Michal and Lucy debate who should ultimately be responsible for regulating Big Tech.

[00:10] - Speaker 1
Hi. My name is Lucy Kind, director of marketing at MobileCoin. And we're back with another episode of Privacy is the New Celebrity, where we sit down with some of the smartest people we know for conversations at the intersection of tech and privacy. Today, I'm thrilled to have Michal Kosinski on the line. Michal is a professor of organizational behavior at Stanford Graduate School of Business, and his research on personal data collection has had an enormous impact on what we know about how big tech exploits personal data. His work has literally changed the game in terms of how we think about digital privacy in our lives.

[00:51] - Speaker 1
Michal, thank you so much for joining us on Privacy is the New Celebrity.

[00:56] - Speaker 2
Hi, Lucy. Thank you so much for having me.

[00:59] - Speaker 1
So tell me a bit about yourself.

[01:00] - Speaker 2
Wow. That's a very open ended question, I guess. On the professional side, I'm a psychologist by training, and I'm mostly interested in studying people in the digital environment and trying to understand our behavior a bit more through the lenses of the digital footprints that we all leave behind when using computers and smartphones and digital platforms and services.

[01:25] - Speaker 1
What sparked you to get into psychology and, specifically, psychology of an online nature?

[01:31] - Speaker 2
Well, it was all an accident. I was planning to be a mathematician. My parents were dreaming of me being a lawyer, and I just became the black sheep in the family and became a psychologist.

[01:44] - Speaker 1
Nice. What was the first thing that you applied yourself to in psychology?

[01:48] - Speaker 2
Psychometrics. I did my first master's in psychometrics, which is the science of psychological measurement. It's the science of trying to understand people as well as possible through the lenses of tests and questionnaires and interviews. And it's actually very close to what I'm doing today, which is trying to understand people better, not through the lenses of questionnaires, but through the lenses of the digital footprints that we all leave behind.

[02:17] - Speaker 1
Amazing. Going back to privacy. When did you first realize that privacy was important to you?

[02:24] - Speaker 2
Well, I was raised in Europe, and I think Europeans are very careful when it comes to privacy. It's a big topic of conversation, and governments are extremely cautious not to cross certain lines here. Thanks to our wonderful European history of the Stasi and the KGB and the Second World War, people are much more aware that privacy is one of the very important values. So coming from Europe, it's pretty natural to care about privacy. Interestingly, I must say that the longer I study privacy, the longer I think about it, the more I'm convinced that in all practical terms, it's just gone now. The sooner we accept that privacy is gone, that a motivated third party can, with some effort, but often with very little effort, know virtually anything they want about us, the sooner we can move on to a discussion about how to potentially alleviate this problem.

[03:33] - Speaker 1
Yeah, definitely. And touching upon that subject, I hear you have a history involving Cambridge Analytica. Do you want to tell us a bit about it?

[03:43] - Speaker 2
Well, my history is that I studied at Cambridge University and there was a company called Cambridge Analytica. So now everyone obviously thinks that I must have worked for them, which is not true. In fact, they offered me a job at the very early stage when I think even they didn't know exactly what they were doing and for whom they were working, but for different reasons, maybe because I was so stupid, I said no. Well, it turns out that it was a very lucky no, knowing what we know today.

[04:19] - Speaker 2
And my other connection with Cambridge Analytica is that some of the people in my orbit, some people that I knew at the time, were working on this project. And so I knew a few things about them before the general public learned about it. And because I was quite outraged about what I learned, I decided to do a bit of a private investigation and then publish the results together with some journalists in The Guardian. I think it was in 2015 or early 2016. So it was the first article reporting on the unethical behavior and privacy-invading activities of Cambridge Analytica.

[05:04] - Speaker 1
When you talk about the ethics of privacy, it sounds like you fall on one side of this whole debacle, right? In some ways, one would say that when Big Tech has our data, they can better serve us with what they think we want. What is your opinion on data and on the consensuality of data?

[05:27] - Speaker 2
Well, I agree to some extent with the statement that data can be used to our great advantage. And I think it's probably clear to most of the people that use technology today that there are many wonderful things you can do by looking at people's digital footprints. You can help them to navigate geographical environments using apps such as Google Maps, which is great for individuals. We can get to places more quickly, but it's also good for societies and the environment. We save time, we use less fuel and so on.

[06:04] - Speaker 2
And there are so many other applications, from entertainment, where we can be served a movie that we would never have thought of watching otherwise, or a book that we would never have thought of reading, to areas such as healthcare, where big data analytics has clearly led to some breakthroughs and improvements in how we live, in our wellbeing and health and longevity and so on. But of course, as a privacy researcher, I'm also acutely aware of the other side of the same algorithms that are used to recommend us better movies or choose a medicine that will solve our problems more effectively and much more quickly than alternatives.

[06:47] - Speaker 2
The same algorithms can be used to convince us to believe in some conspiracy theories, or not to go cast our votes, and so on. So essentially, like with many other technologies, algorithms are today used for the betterment of humankind, but also to our disadvantage.

[07:08] - Speaker 1
Yeah, definitely. I would even venture to say that these algorithms are a tool.

[07:13] - Speaker 1
And it depends on how one is using the tool.

[07:16] - Speaker 2
Very much so. And people very often use the example of Cambridge Analytica and how they influenced voters in one or two recent elections. And people keep forgetting that politicians on both sides of the political spectrum are using those technologies to communicate with voters. Hillary Clinton outspent Donald Trump three to one buying the services of companies like Cambridge Analytica, probably better ones than Cambridge Analytica, to influence voters and use algorithms to understand voters better. And I think this example very nicely illustrates the issue here: if you have someone with good values and some ethics, they can use those technologies to communicate with voters better, to make voters more engaged and better informed.

[08:08] - Speaker 2
And I think that everyone would agree that in a democracy, engaged and well informed voters are essentially a great thing to have. But other politicians can use exactly the same technologies to spread conspiracy theories and hateful messages, and so on. In fact, how you judge how this technology is used very often depends on your own values and point of view. One person's propaganda might be another person's education, and very often it's just a point of view.

[08:41] - Speaker 1
So how do you suggest that we, as regular people, become aware of when we're being manipulated by technology? And how do we protect ourselves from this?

[08:53] - Speaker 2
Well, I think the solution is probably not on the individual level, but on the societal level. We can close our social media accounts and stop using popular messaging platforms and popular search engines, but there's a limit to how much we can escape the new digital environment, especially for people who are not privileged. If you are privileged enough to have someone print your emails and do the banking for you, and so on, maybe you don't need to touch a smartphone. But a single mother working two or three jobs has no choice but to use online banking, and Google Maps to navigate around places, and WhatsApp to communicate with people.

[09:35] - Speaker 2
So there's essentially no escape, and just closing Facebook doesn't do much. You're still going to pay with your credit card, or travel through public transport where your ticket is going to be scanned. So my point would be: let's participate in those environments and try to make them better by collective action, and by voting for politicians who maybe don't have all of the solutions, because no one really has them, but are motivated to hire and support people who have solutions. And also, let's choose the products of companies that seem to respect our privacy more, rather than those who just don't care about it.

[10:15] - Speaker 2
And in this way, vote with our money, essentially. But on an individual level, we really can't do much apart from educating ourselves and supporting the right collective actions.

[10:27] - Speaker 1
Yeah, definitely. A top down approach seems really key in creating change on a larger level. And what you said about privacy being a privilege for certain folks is something we've thought a lot about over here at MobileCoin. We think privacy is a right we all deserve. We've talked a lot about the different ways that algorithms and other digital tools for connection can be used, both in terms of consensual and non consensual usage. What do you think are the biggest threats right now to privacy that we need to be looking out for?

[11:06] - Speaker 2
Well, I think that the threats to privacy are fairly similar to the ones that we were facing five or ten years ago. Of course, we are leaving more digital footprints behind. And of course, algorithms are getting better at converting those digital footprints into very accurate predictions of our future behavior or intimate traits, ranging from political views to sexual orientation. And of course, losing the privacy of those traits, such as political views or sexual orientation, can be literally a matter of life and death for many people in many places. In America, maybe not so much anymore, fortunately, but think about countries such as Russia or Saudi Arabia, or even some more conservative and more homophobic communities in parts of America.

[11:59] - Speaker 2
There, losing the privacy of, for example, sexual orientation can end very badly for an individual. So I think the threats here are roughly the same. What I see as perhaps the biggest challenge now is to look for good solutions while avoiding total panic. And total panic, unfortunately, can end up with people just doing silly things, like escaping their digital environments entirely and refusing to use modern technology, which is bad for the individual. You not being on social media disconnects you from where the discourse about many issues is happening.

[12:42] - Speaker 2
It's disconnecting you from your friends, but it's also disconnecting your friends from you. And society at a meta scale loses each time someone tries to essentially distance themselves from new technology. I think we should try to change it from within, as well informed and educated users who are using those platforms, educating each other while demanding more protection of their privacy. And one other observation that I have, and it comes from a person who for many years believed that if we just did the right thing, and if we just introduced the right privacy-protecting technologies, like MobileCoin is doing, for example, we could potentially go back to times where our privacy was more of a right guaranteed to most of us.

[13:34] - Speaker 2
And the more I study this subject, the more I'm convinced that, unfortunately, this is probably not going to happen. Of course, I'm a big proponent of privacy-protecting technologies, but it's a bit like with fire safety: I'm all for fire safety, but fires will keep happening from time to time, whether you want it or not. And I think the trick here is to try to design the society, design our legal environments, design our interpersonal relationships in such a way as to make sure that even if this fire, this tornado of losing one's privacy, happens to you or someone you love or someone in the society, even in such a case, this person would still have the privilege of a happy and safe life.

[14:22] - Speaker 1
So many interesting topics that you've just mentioned. I want to really explore this notion of discrimination via big data, and I know that you had some research recently that touches upon identification of sexual orientation via facial features. Would you like to tell us a bit about it, and perhaps the motivation behind conducting such research?

[14:48] - Speaker 2
Sure, of course. So we are surrounded by cameras now: in our phones, but also CCTV cameras all around, on the streets and in office buildings and so on. And our facial images are constantly being uploaded to various databases, whether it's Instagram or Facebook, or your own Google Photos folder, or some governmental or corporate database where your images are being stored. Now, those images are being analyzed using facial recognition algorithms. And those facial recognition algorithms are essentially sorting images into clusters. The main purpose, the main use of a facial recognition algorithm, is to identify the same person across different images.

[15:33] - Speaker 2
So essentially, if you're spotted by a camera, the algorithm checks which other person in the database this person is most similar to. And if they identify your face in their database, they say: okay, we found you across those two different facial images. They can identify an individual. But we are not only similar to ourselves; we are also similar to people like us. An increasing number of governments and companies and startups are employing those facial recognition algorithms to identify not only people, but also their intimate traits, essentially using the following logic: if your face is more similar to the faces of liberals than to the faces of conservatives, then you're more likely to be liberal.

[16:22] - Speaker 2
And it turns out, unfortunately, that this logic works, meaning that given just a picture, knowing nothing about you, having no background on who you are or where you're coming from, just having your facial image, those facial recognition algorithms, the kind built into your phone, can recognize your intimate traits, ranging from obvious things such as age and gender and emotion, to things that people find counterintuitive, like political orientation or sexual orientation. And in my recent studies, I sincerely tried to make people aware that, hey, look, those widespread algorithms that are being used by startups and governments to identify not only you, but also your traits, can very accurately predict those traits.

[17:08] - Speaker 2
And, of course, governments and startups are not really motivated to boast to everyone about how accurate those algorithms are. They just publish patents. There are actually patents ten or more years old that essentially describe the use of those algorithms to identify intimate traits, but they would never say how accurate they are. And I essentially decided to audit this technology, to try to see how potentially risky those technologies are. And unfortunately, my research results show that those predictions are pretty damn accurate, close to perfect, actually, for some intimate traits.

[17:46] - Speaker 1
Yeah, that's nuts. It's almost like your agency for making a first impression has been stolen from you.

[17:54] - Speaker 1
You no longer have your own personal identity. It's been predetermined. That's crazy. It kind of brings us back to the title of our podcast, Privacy is The New Celebrity, right? It's something that is no longer a given. Do you think that's true? And what does this notion mean to you?

[18:12] - Speaker 2
I definitely agree that people have started caring about privacy much more, especially in the United States. Once again, coming from Europe, privacy was a big topic there for the longest time, but when I moved to the US maybe seven or eight years ago, it was not such a big topic. People just didn't care that much. And I think that has changed, to some extent, thanks to those scandals, such as the Cambridge Analytica one, where Cambridge Analytica was not really doing anything much different from what other companies were doing at the time, apart from the fact that they were really open about their approaches and their goals and how they went about achieving them.

[18:55] - Speaker 2
And I think that this essentially woke people up a little to how powerful it could be to invade someone's privacy, and how much power it gives to a company or an individual to know your intimate traits. And some of those things are obvious, right? Like, if someone knows your political views or sexual orientation in places where we still suffer from prejudice and homophobia, this can be used openly to blackmail or persecute you, and so on. But it turns out that invading people's privacy can also be used in more subtle ways to take advantage of them.

[19:33] - Speaker 2
Like, for example, if I know your personality traits, or if I can estimate other psychological traits of yours, or states, such as your emotions, I can then use this knowledge to craft a message, a marketing message selling you some consumer product, or maybe a political message encouraging you to vote for something or not vote for something else. Knowing your intimate traits allows me to be essentially more persuasive, and potentially even manipulative, while communicating with you online.

[20:04] - Speaker 1
I guess the silver lining of this whole Cambridge Analytica situation is that it brought privacy to the forefront. And Cambridge Analytica represented a low in privacy for many Americans. Do you think it's gotten better or worse since that happened several years ago?

[20:23] - Speaker 2
Well, first of all, it had been very bad even way before Cambridge Analytica; companies like that existed before, and unfortunately, the situation has just gotten worse in many ways. Of course, on one hand, we get more technologies respectful of privacy, or technologies that are aimed at protecting privacy. But in this arms race, I think there are also a lot of companies and ideas out there aimed and focused not only on harvesting people's data, often without their knowledge and permission, but also on creating products and services that make it so pleasant and wonderful and entertaining to just give your data away, and then receive a wonderful game, or a better experience on some platform, in return.

[21:15] - Speaker 2
So essentially, I think that compared with five or six years ago, we, on average, have much less privacy today.

[21:22] - Speaker 1
Do you think people are aware that their privacy has decreased?

[21:26] - Speaker 2
I think that some people are. The problem, of course, is that there's just so little that you can do. You should not, and cannot, stop using Google or online banking, or stop walking the streets. Those are just not acceptable solutions, not solutions that people would like to adopt on a larger scale. I think that the solution here, unfortunately, is some collective action, changes in regulations, policymakers putting more pressure on companies to respect our privacy. The problem, of course, is that policymakers and governments very often publicly criticize companies for invading our privacy.

[22:06] - Speaker 2
But in private, governments often enjoy the benefits of invading people's privacy, such as being able to monitor the society better and monitor for threats better.

[22:19] - Speaker 1
Well, Europe has been more successful on this front with GDPR. Why do you think it's a challenge to implement something like that in other countries and cultures?

[22:29] - Speaker 2
Well, I even consulted on some little tiny bits of GDPR, so I'm very proud and happy that this law came into effect. But we also have to remember that laws like this always bring trade-offs. And in the context of GDPR, the regulatory burden is so big that startups and small companies just cannot deal with it, meaning that large companies such as Facebook and Google can make sure that they're compliant; they can hire enough lawyers to essentially deal with this new regulatory burden.

[23:08] - Speaker 2
But it completely kills the creativity and progress of the smaller players. Technological progress will increasingly be moving to other environments which are not regulated in the European fashion, so that's the first price that we pay as a society: essentially slower progress, and technology being developed elsewhere. And another layer to this problem is that consumers today can very easily move between regulatory environments. An example, of course, is Google and Facebook, which were developed in the States and, from the start, were not compliant with European regulation, even before GDPR.

[23:45] - Speaker 2
Those companies were not compliant with European regulation, and yet European authorities couldn't really stop Europeans from using those platforms, because essentially there was no alternative, meaning that the European Union essentially gave up the ability to foster and shape the progress; it happened in the United States instead. And now, as the United States is becoming more aware of privacy issues and ramping up its regulation, there are of course many benefits to that, but the cost is that progress will increasingly move to places like China, where companies today just care much less.

[24:29] - Speaker 2
So the bottom line is that if we want to take regulatory action, it should not be unilateral, done in just one country or one bloc of countries. We should work essentially as a global community here to solve those problems.

[24:44] - Speaker 1
I hear you when you say that we need better regulations from a government point of view. But should that also be the responsibility of individuals? And how about the responsibility of tech companies? While tech companies aren't our elected officials, one could even say that they hold so much inherent power, right? Like you were talking about, how does an individual get away from using a behemoth like Facebook or Google? We did a focus group at MobileCoin, where we asked people, what do you think of social media platforms like Facebook?

[25:18] - Speaker 1
And we had people say, well, we want privacy, so that doesn't jibe with our daily life. But at the same time, they said, well, we love using WhatsApp. And when we told them Facebook owns WhatsApp, right? They're like, oh, yeah. But how can you get away from that when all your friends are on there? So what is the individual to do when you can't avoid using tools that infringe upon your privacy?

[25:45] - Speaker 2
Well, there is a big problem. It might be that individuals really can't do much. And of course, I would encourage everyone to use privacy-protecting technologies as much as they can. But we also have to be aware of the fact that there's just no escaping it. At the end of the day, we can switch to more private messaging platforms, but your purchases will still be visible to banks and financial regulators. Your geographical whereabouts will still be tracked by the government and a bunch of companies that have access to your smartphone or to the GSM network.

[26:18] - Speaker 2
So essentially, at the end of the day, we have to take collective action, protect ourselves, and educate ourselves and our friends as much as possible. But on top of that, we have to act essentially as a global community, because even the level of an individual country is not enough; it's not meta-level enough. We have to go to the level of the global community. And I wanted to mention one more thing here, which is, of course, we should demand that companies such as Facebook and Google be more responsible.

[26:48] - Speaker 2
But we also should remember that those companies are not run by elected officials. They may be driven by ideologies that we don't fully agree with. So having those companies now decide what's okay and what's not okay, what content can be published online, how our interpersonal interactions should be governed, I think that would be asking for too much. Those are private, profit driven companies, and they should stay this way. We should have smart regulators and smart policies, where we, as a society, tell those companies what we want them to do, rather than demand that those companies figure it out on their own.

[27:28] - Speaker 1
Yeah, I think the question really is, how should tech companies be regulated when they have all the power? How does this regulation work? You just gave an example of an idea that, in theory, sounds beautiful, but when it comes down to implementation and execution, is it still flawed? I think that we need to appeal to the actual ethics and ethos of these private companies in order to get them to care about people's privacy, because they can make as many back doors as they want.

[28:00] - Speaker 1
MobileCoin is open source, but not all companies are. How are we supposed to know? How are people in government supposed to know that tech companies are actually respecting regulations when they have all the power already?

[28:11] - Speaker 2
Well, I think there should be some trust, but the trust should be limited, and good regulators also have ways of checking whether companies are complying with regulation or not. But I would still be reluctant to essentially leave the decision making power to the companies themselves and demand that companies do the right thing without telling the companies what the right thing is. Because if you do this, then you will have a Hobby Lobby situation, where the values of the people who ran the company were misaligned with the values of many other Americans, but they were values they genuinely held, and they tried to enforce those values on their employees, and maybe in some other cases on their customers as well.

[28:55] - Speaker 2
So the problem with letting private institutions enforce their values on their employees or their customers is this: there's no problem if you agree with the values of those institutions, but sometimes we'll have institutions with whose values you don't agree. And this is why I think that the safe solution, the solution that democratic, civic societies have developed over millennia, really, is to let the profit driven companies focus on making their profits, and for us to focus on regulating them in a smart and sensible way.

[29:33] - Speaker 1
Giving those companies direction and telling them what is okay to do and what is not okay to do makes sense. For listeners who don't know Hobby Lobby, what are you referring to?

[29:44] - Speaker 2
I'm referring to, and I hope my memory is not fooling me here, the situation where Hobby Lobby required that the health insurance for its female employees not include anything related to contraception or family planning or birth control. There was an uproar in society, and I can empathize with it; I agree with that outrage. But I also see that those employers were following some values that they had, values that I don't agree with. And this is why I'm afraid to leave those decisions related to values to private corporations, because sometimes you agree with the leadership of those companies, but sometimes you may not agree with them, and then we are in big trouble.

[30:33] - Speaker 1
I'm glad we've had this in depth discussion on changes on a grand scale that we would like to see made for privacy. Going back to day to day life: we talked about how cameras can track us using just facial analysis. And I've lived in countries where they say CCTV can identify and track you for miles to see what you're doing. At airports now, when you go, you don't even need to scan your passport. They just know who you are, even with your mask on. What are your privacy best practices for day to day life?

[31:10] - Speaker 2
Well, once again, I think that there's very little we can do to escape the big brother of corporations, governments, and very often even small startups. So I just try to make peace with it and focus essentially on making those issues more visible, and on educating people about the risks that the loss of privacy presents us all with. And essentially, because there's no escape, I'm not really trying to escape much.

[31:45] - Speaker 1
That sounds pretty bleak. So we just give up and give in and hope for a better future?

[31:51] - Speaker 2
Well, we can do many other things. Just to come back to something we were talking about a bit before: the real solution to homophobia is not giving everyone the absolute right to privacy of their sexual orientation, which would be nice to have. The solution to homophobia is a more open, tolerant society, with laws protecting minorities and more positive and tolerant interactions.

[32:14] - Speaker 1
Yes, I agree that a change in cultural tone is something that's very important. And I would even say that big companies, as much as governments, have a responsibility for setting a positive cultural tone.

[32:32] - Speaker 2
One could be a leader of change in so many other ways, and one could actually make this maybe slightly counterintuitive argument: that if you can afford to lose your privacy now, you probably should do it, because by losing your privacy now, you are going to drive the change and remove some of the dangers that people are now facing when losing their privacy. And if this sounds absolutely stupid and radical, let's just focus for a second on the fact that it has happened in the past. Harvey Milk was arguing that if you can afford to lose your privacy and be public about being gay, you should do it.

[33:14] - Speaker 2
And in this way, show your neighbors, show your employers, show your friends that there's nothing evil and nothing wrong with being gay. And Harvey Milk, as we all know, paid the highest price for coming out. But thanks to his heroism, and the heroism of many of his contemporaries and people living today, we are now living in a much more tolerant, far from perfect, but much more tolerant and much more inclusive society than Harvey Milk experienced in the past.

[33:42] - Speaker 1
Yeah, I think being able to afford, like you said, to be completely open about your identity is a really powerful thing. But these days, privacy extends even to our day-to-day actions.

[33:59] - Speaker 1
What you purchase, where and when; what your friends purchase. What does that say about you as a culture? Don't you think there's something in the consensuality of giving your identity, of showing your identity? Where does that go if everybody just gives up and gives in?

[34:17] - Speaker 2
Well, I think that people live in two parallel universes, or probably even more than two. In our daily lives, we would be very reluctant to reveal our browsing history to our closest friends or even our partners. We have some search queries we would never enter into Google in public. We buy stuff online that we will just never share, even with our closest friends and family. And yet we are doing it on our computers, with our GSM operator watching, our operating system recording everything we do, a bunch of apps recording what we are doing, and so on.

[34:56] - Speaker 2
And somehow people don't care about that. It is just always mind-blowing to me that we apply two completely different privacy standards. With even the closest people to us, people we share so many other intimate things with, we hold things back, and yet we seem to be completely open with engineers working at Google and Facebook and other companies that collect all of this intimate data about us.

[35:19] - Speaker 1
Well, I would even say it's because you don't know what they're going to do with that data. Right? Data mining, people selling your data, that's all something individuals have to deal with these days. You're not consenting for your data to be sold to some company you've never heard of, or even to some government that wants your information. And don't you think there might be consequences to having all of your personal information out there to be judged out of context? What about when you're applying for jobs?

[35:48] - Speaker 1
What about when you're trying to get a lease, and people are making inferences about you based on data they received non-consensually, without any context?

[35:59] - Speaker 2
Well, there's absolutely no question about that. But apart from the risks, I just want to stress once again that there are environments where data is being shared in an ethical and respectful fashion, and there are many situations where sharing data improves products and services and improves our lives. But of course, there are also risks, such as an insurance company taking your data, predicting your future health issues, and then hiking your insurance rates or refusing to insure you. So there are definitely risks there.

[36:39] - Speaker 2
But again, I think the solution to those risks is not just expecting that we can somehow have perfect privacy. The solution is smart regulation that protects us from the prices we would otherwise have to pay, from the risks of losing our privacy. Take the context of insurance. Yes, insurance companies can be very intrusive, but we could demotivate companies from being intrusive by essentially saying: look, you can try to learn as much as you want about people, but you have to offer the same insurance rate to everyone, like we do in Europe, by the way.

[37:13] - Speaker 2
And suddenly, in the context of having to insure everyone at the same price point, the insurance company may still want to learn as much about you as possible, but now this knowledge will not be used to refuse to insure you, but rather, maybe, to help you avoid disease before it happens and to help you lead a happy and healthy life. And on top of everything else, I also want to point out that while sharing data can be very dangerous on the individual level, there are also very big social-level positives.

[37:49] - Speaker 2
For example, sharing your personal health information, your diagnosis, your medical history, and so on can be very dangerous and very unpleasant individually. But on the societal level, being able to access the data of hundreds of thousands or millions of people helps the medical industry, medical researchers, and government health authorities to manage pandemics, to better understand the efficacy of drugs and medical treatments, and to design better drugs and better treatments, and in this way make our lives better, safer, and longer. It clearly shows how something dangerous on the individual level can bring very significant social benefits, which brings me to a perhaps controversial thought: maybe we should start thinking about sharing data in the same way as we think about taxes.

[38:50] - Speaker 2
No one loves paying taxes, but we all understand that by giving something up, we gain something as a society. And I think the same may apply to data in the future. By agreeing for some of our data to be ethically shared with others, we can gain something as a society.

[39:09] - Speaker 1
Yeah, I think the ethical part is definitely the part that MobileCoin, and myself personally, would love to see included. I hear what you're saying: what if there were some incentive to share data? Currently, it really seems that privacy and profit are at odds. But what if we could shift it so that there is some sort of profit or incentive for companies to ethically obtain data and for individuals to ethically share data?

[39:41] - Speaker 2
Completely agreed. And there are many ways individuals could share their data in a fairly anonymous fashion. There have been many proposals from companies and individuals to pay people for their data, and I actually don't think that is doable, simply because the moment we start paying people for data in actual cash, a lot of fake data will be created, and then, of course, the value of the data will go down. But people can be paid for their data with wonderful products and better services, because the more a platform knows about you and the better it understands you, the better it can serve you. It could be a win-win situation.

[40:25] - Speaker 2
But I'm hoping that at some point we, as a society, as a global society, will find a way of sharing our personal data with institutions such as universities and medical researchers, in such a way as to allow those institutions to come up with better medicines, better diagnostic tools, better dietary advice, and in this way make our lives better.

[40:51] - Speaker 1
Yeah, I think the word sharing is key, and consensuality is key. Would you personally ever give up your right to privacy?

[41:01] - Speaker 2
Yes, of course. Many other people do not have a choice, but I think I personally have enough technological literacy and enough education in this area to somehow protect my privacy. It would cost me a lot of effort; I would have to give up a lot of things that I enjoy doing and a lot of technologies I enjoy using, but I guess I probably could protect my privacy better. Instead, I've decided to lead a fairly public life. Whenever I write an email, I try to write it under the assumption that it could one day be made public.

[41:40] - Speaker 2
Whenever I write a message, whenever I take a note, I always try to assume that it could one day be public. First of all, I think it makes me safer, because I'm less prone to blackmail. But I also think it makes me a better person, because if you think while writing an email that it may one day become public knowledge, I feel most of us would end up writing slightly better emails. But this, of course, is just scratching the surface, because there are so many other aspects of our lives that we should have control over.

[42:15] - Speaker 2
And one point I want to mention here: I'm making this choice, and I think everyone deserves to be allowed to make this choice as well. Unfortunately, based on what I said before, I'm also increasingly convinced that this choice is being taken away from us by new technology.

[42:35] - Speaker 1
Well, hopefully we can put some technology out there that respects the consensuality of data while still embracing the bright future of how data can improve technology and our daily lives. What is a privacy technology that does not exist right now, but that you think should exist?

[42:55] - Speaker 2
That is a great question. If I knew, I would probably be running a startup right now and getting so much money from investors for this wonderful new technology. I wish there were a good way of really, truly anonymising data. The problem with data is that even if you remove the obviously non-anonymous bits of it, like your name and your email and your IP address and your date of birth, people think, oh my God, okay, my data is now anonymous. No, it's not.

[43:29] - Speaker 2
There's only one person who went to a particular store at a particular hour on a given day, and then to another store on another day the previous month; those patterns in the data make you absolutely unique. If there were a way of getting better at anonymising this data in such a way that it still remains useful for improving products and services, if, for example, we were able to share our location data with Google Maps to help it get better at navigating people from point A to point B and yet maintain our privacy, which I don't think is possible at the moment, this would be a technology I would really love to see.
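[Editor's note: as a toy illustration of the uniqueness point Michal makes here, not something from the episode, and using entirely invented names and visits, a few lines of Python can show how "anonymized" behavioral traces often remain unique, and therefore re-identifiable.]

```python
from collections import Counter

# Each person's "trace" is the set of (place, hour) pairs they visited.
# All names and visits here are invented purely for illustration.
traces = {
    "alice": frozenset({("cafe", 9), ("gym", 18), ("store", 20)}),
    "bob":   frozenset({("cafe", 9), ("office", 10)}),
    "carol": frozenset({("cafe", 9), ("gym", 18), ("store", 21)}),
    "dave":  frozenset({("cafe", 9), ("office", 10)}),
}

# "Anonymize" the data by dropping the names -- only behavior remains.
anonymous = list(traces.values())

# Any trace that occurs exactly once still points to a single person.
counts = Counter(anonymous)
unique = [t for t in anonymous if counts[t] == 1]
print(f"{len(unique)} of {len(anonymous)} traces are still unique")
```

With real location data the effect is far stronger: given millions of possible places and timestamps, nearly every person's trace is unique, which is why stripping names alone does not anonymize it.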

[44:15] - Speaker 1
Yeah, something that respects both innovation and progress of tech and the privacy and agency of the individual. That would be great. Well, I look forward to Michal's anonymity cloak when you design it.

[44:28] - Speaker 2
Yes. We're starting the startup next month.

[44:31] - Speaker 1
Count us in. Michal, I can't thank you enough for joining us on Privacy is the New Celebrity.

[44:37] - Speaker 2
Lucy, thank you so much for having me, and I'm looking forward to losing some of my privacy with you guys in the future.

[44:49] - Speaker 1
We've been speaking with Michal Kosinski, a professor of organizational behavior at the Stanford Graduate School of Business and the author of many, many research papers that have truly transformed the way we think about data and privacy. Thank you so much for tuning in, and please subscribe to Privacy is the New Celebrity on Apple or Spotify or wherever you get your podcasts. We'll be back soon with another episode. And don't forget to check out MobileCoin Radio, a unique live stream we do every Friday at 1:00 p.m. Pacific Time, featuring different musicians, performers, and visual artists. You can find that, along with the full archive of shows, on

[45:28] - Speaker 1 
Thanks for listening. I'm Lucy Kind. Our producer is Sam Anderson, and our theme is composed by David Westbomb. Have a great week. And remember, privacy is a choice we deserve.