The Decentralists

Hot Topix: Clearview in the rearview?

March 18, 2021 Mike Cholod, Henry Karpus & Chris Trottier

Clearview AI, the tech company that provides facial recognition services, has been a source of controversy for quite some time—we’ve previously blogged about them. A recent investigation found that Clearview AI’s “scraping of billions of images … represented mass surveillance and was a clear violation of the privacy rights of Canadians.”  

Clearview stopped offering its services in Canada in 2020 and suspended its contract with the RCMP—but an ongoing investigation (by the Privacy Commissioner of Canada) is looking into the RCMP’s past use of Clearview’s services. 

In the U.S., it seems that Clearview AI wants to take matters of facial recognition privacy to the Supreme Court.

Why is Clearview AI so problematic?

Why is law enforcement use of Clearview AI’s technology such a concern?

Is this the most egregious example of surveillance capitalism out there?

In this episode, we discuss why no company should ever put our right to privacy at risk.

Henry: Hey everyone. It's Henry, Mike, and Chris of The Decentralists. We've got an interesting hot topic today because we're going back, it's a bit of a throwback. We spoke about this company called Clearview back in August. They are the kings of surveillance capitalism, and what has happened since then has just gotten deeper and darker. Now we've got immigrants' rights groups, activists, and civil rights organizations suing Clearview all across the United States because they feel what Clearview is doing is unconstitutional. The interesting part is Clearview is already banned in Canada. So, Chris, I know you know an awful lot about this operation. I'd like you to start and bring everyone back up to speed on what Clearview is, and Mike, I know you've got a lot to talk about.

Mike: Oh, I'm ready.

Henry: Chris, over to you.

Chris: So, Clearview AI is a facial recognition database that's incredibly popular amongst law enforcement agencies such as the FBI. The reason they're able to compile all these faces in their database is that they illegally trawl through social media profiles and download people's photos. In doing so, they violate the terms of service of all these social networks.

Mike: Because they want to keep it to themselves.

Chris: Because they want to keep it to themselves, and so while police may find Clearview AI to be very useful, the problem is it's a massive open gateway to racial profiling. It entrenches the deep-rooted, systemic racism that your local police department might have. But there's a bigger problem with Clearview AI, and it's that it has strong ties to white supremacist organizations.

Henry: It does.

Chris: Oh yeah. So, let me give you an example here. Many of Clearview AI's employees come, funnily enough, from Breitbart. There are ties to Andrew Auernheimer, who was the webmaster of the neo-Nazi site the Daily Stormer, and to the conspiracy theorist Mike Cernovich. There's a fellow who works at Clearview AI, his name is Marko Jukic, who has openly advocated for racial segregation.
 
 Henry: Are you kidding me?

Mike: You've got to be kidding me.
 
Henry: No way. That's not, that's not possible. Really?
 
Chris: Yeah. So, he's, he's employed there.

Mike: Somebody better talk to Clearview's HR department.
 
 Henry: Oh, they don't have one.

Chris: One of Clearview AI's lawyers, Tor Ekeland, is known for representing various far-right provocateurs and racists, including the Daily Stormer website. But then the whole thing gets a lot more interesting when you look at one of the founders. His name is Hoan Ton-That, I'm not sure if I'm pronouncing his name correctly. He's an Australian entrepreneur who now lives in the US. He first came to media attention because of a known phishing scheme. So, it's crazy, in my opinion, that the police are basically supporting an alleged criminal. Sure, he hasn't been convicted, but you know.

Henry: Okay, and that's because, Chris, police departments all over America, at every level, many of them are customers of Clearview. They use the service because they can watch and analyze the data they get and find out who is who.

Mike: Well, it's more egregious than that, Henry. Let's face it, Clearview AI has a very simple premise. They basically went out, sent a bunch of bots out to crawl over and scrape all the social media sites, and collected three or four billion pictures.

Henry: Wait, billion?

Mike: Billion, and they freely admit this because it's part of their value proposition. Then they basically put it together in this nice convenient app, even a mobile version, where literally any human who has their hands on this app can walk down a street, assuming there were people walking on the street, just hold the phone up with the camera, and it can pick people out, just strangers walking by, and compare them to this database of three billion photos to tell you who they are.
 
Henry: You mean that can be done in real time on a mobile phone? I envisioned this with surveillance cameras, everything uploaded, with the police actually doing it back at the station. This is crazy.

Chris: Yes, it's that easy.

Mike: It's that easy. In fact, I remember one of the very first stories I read on Clearview, it must be over a year and a half ago now, was about one of their investors. There's this big rich guy in New York who's one of their investors, and he had an early version of Clearview and was using it to do background checks on the potential boyfriends his daughter was dating. Seriously. You'd probably like that, Henry, you know what I'm saying?

Henry: Oh, both of my daughters are married. It's all good. They're great guys.

Mike: But you know what I'm saying? If you look at some of the recent press around these guys over the last year and a half, the way it's usually set up is this. The first thing they say is, well, it's not like you can just go and download this off the app store. We don't make Clearview AI available to just anybody. So, the idea is that the average human, unless you're a rich guy in New York, apparently, can't have this mobile app on their phone. You've got to trust these guys that that's the truth. The second part is the workflow: you've got this combination of an app and a database of three billion images scraped from social media. Regular human beings, and probably a few garden-variety criminals here and there, have been posting pictures of themselves on Facebook, LinkedIn, Twitter, and Instagram for years. Now their facial recognition pattern has been taken from those photos, without their consent and without the consent of the platforms, and used to compile a database that can be compared against the footage from the big cameras the police are looking at.
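[Editor's note: for readers curious what "comparing a face against a database of billions of photos" means mechanically, systems like the one described typically reduce each photo to a numeric vector (an "embedding") and then search for the most similar stored vector. The sketch below is only an illustration of that general idea, not Clearview's actual code; the names, toy vectors, and 0.9 threshold are invented.]

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def match_face(probe, database, threshold=0.9):
    """Return the identity whose stored embedding is most similar to
    the probe embedding, or None if nothing clears the threshold."""
    best_name, best_score = None, threshold
    for name, embedding in database.items():
        score = cosine_similarity(probe, embedding)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy 4-dimensional "embeddings"; real face embeddings are typically
# hundreds of dimensions produced by a neural network.
db = {
    "alice": [0.9, 0.1, 0.0, 0.1],
    "bob":   [0.1, 0.9, 0.2, 0.0],
}
probe = [0.88, 0.12, 0.01, 0.09]  # embedding of a new photo of "alice"
print(match_face(probe, db))  # prints "alice"
```

At the scale discussed in the episode, the linear scan would be replaced by an approximate nearest-neighbour index, but the matching principle is the same: a new face is flagged whenever its vector lands close enough to any of the billions of scraped ones.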

Henry: Yeah, amazing.

Mike: Right. So, this is kind of where it starts to go with this lawsuit in California. You take this scraped biometric facial recognition information and combine it with live facial biometric information in police databases, which has been debunked as being unable to reliably identify people of colour or people with certain racial characteristics.

Henry: I read about that.


Mike: Or you have a mask on because of COVID. Right, and so the idea is you've already got this infrastructure in place with law enforcement, where they're watching all these cameras, checking everybody, and taking facial recognition patterns. Those patterns are not accurate, definitely not accurate enough to convict somebody, and then they're comparing them to a database of illegally scraped private photos. So, think of what can happen: your identical twin is a bank robber, Henry, and you are not, you are a law-abiding citizen in BC. All of a sudden you go someplace where a lot of biometrics are being used, like airports, or hospitals, or shared transportation, heck, just walking down the street depending on where you are, and now the police are alerted to your presence, assuming you're the bank robber. So, this is the real danger of posting personal information on social media. This is another wake-up call. You hear lots of stories about politicians posting things when they were 18 years old and drunk, and then 10 years later when they run for office, somebody goes back, looks at their history, and they lose their nomination over something they did 10 years ago.
 
 Henry: When they were a child, that's bad enough.

Mike: But regular humans who are using these centralized platforms to communicate and share personal information are now being proactively profiled by law enforcement.

Henry: You mentioned something very interesting a moment ago, and that was that, as far as you know, there's never been a conviction based on Clearview.

Mike: As far as I know, but remember the law enforcement agencies that use it are not exactly forthcoming with the fact that they use it.

Henry: Well, that's true. But that wouldn't even matter, because they could use it as a tool to further their investigation.
 
 Mike: Well, exactly.

Chris: So, Henry, something to remember here is that Canada's federal Privacy Commissioner, Daniel Therrien, said during a news conference last February that Clearview's actions amount to putting people into a perpetual police lineup. That's one reason why Clearview AI is now banned in Canada.
 
 Henry: It's not even banned guys. It's illegal.

Mike: Oh, that's fantastic. Right.

Henry: The RCMP were using it. They were found to be using it, and the Commissioner said, nope, that's it, it's illegal. Think about it: this is a perfect example of what people who have been warning about facial recognition for years were afraid of. The danger of facial recognition is that you have a technology that is not a hundred percent accurate. Say something happens, your identical twin robs that bank we talked about earlier. You've got law enforcement sitting behind a bunch of cameras, watching a bunch of video feeds and comparing them against a database in Clearview AI, which then goes back and does criminal background searches.

So, the problem is that just because you may have some kind of prior criminal conviction does not mean that you committed the crime you're being accused of, you know what I mean?

Mike: So, if you are unfortunate enough to be walking down a street and get caught on one of these cameras, which creates an alert for law enforcement, potential criminal walking down the street, or past criminal walking down the street, then all of a sudden, if something bad happens, like that bank gets robbed or somebody gets mugged, guess who's going to be the first person to get a knock on their door. Right.

Henry: So now guys, what's the issue right now in California?

Mike: Well, this issue is kind of coming to a head. In the United States, they didn't respond like Canada; law enforcement is a lot more fractured there. So, Clearview has been banned in cities like, I think, Portland and San Francisco and Boston, places like this. And what you've got now, specifically, is human rights groups and immigrant rights groups alleging that this Clearview technology, which has been proven unreliable at identifying people with certain racial characteristics, should not be used. So, the technology is already kind of wrong, and now you're taking two wrongs to try to make a right.

So, what these groups are saying is: we represent people who are squarely in that gets-profiled-incorrectly category, and this technology is not reliable, this technology is illegal, this technology violates people's rights, and it should not be legal in California. So, they're basically suing. I don't think it's one of these things where they're looking for a bunch of cash; they're looking for California, as a state, to ban the use of Clearview AI by law enforcement in the state.

Chris: Okay. So, I also want to say here that not only do they have a pretty good case about its unreliability, but the very making of Clearview AI was done with a bias against undocumented immigrants.
 
 Henry: Really?
 
Chris: Yes. One of the folks who has ties to Clearview AI, his name's Pax Dickinson, is well known in the far-right extremist sphere, and he has posted about using Clearview AI as a means to identify undocumented immigrants.

Henry: So, how do you tell from a face whether somebody is documented or undocumented? That's ridiculous.
 
Chris: Exactly. What I should say here is that Pax Dickinson isn't just some schmuck who's using this software. He is a close peer of the CEO. So, if the making of the product is tainted to begin with, they should fold up shop.

Mike: Right. Well, guys, let's face it. Some people will say that if this type of technology helps prevent, say, a terrorist event, it's a good thing. But to me, it's the most egregious example of surveillance capitalism, this idea of taking people's data, watching people's actions, and using it to manipulate them. Of course it is, it's literally a surveillance app, you know what I mean? And this, I think, is the real danger with Clearview: this technology does not need to be mobile.

Henry: Very good point. Exactly, because in its purest form it just requires analysis.

Mike: Well, you know what I mean? If you're going to compile a facial recognition database and sell it to law enforcement to be compared against, say, live camera feeds, why do you need to make a mobile version of that? Do you really think the upside is police walking around with their phones pointed at a crowd? It's ridiculous. These guys clearly have no problem being self-professed racists and anti-immigrant folk, and being public about it, and they built a mobile app. These are the types of things that, when people argue about systemic racism and racial bias in society, this is the stuff that confirms it.

Henry: No question about it.

Mike: To Chris's point, that's why this thing should be banned. And to me, it's the type of thing where I'm betting even Google and Apple can't get involved, because I'm betting it hasn't gone through the Android or iOS app stores. So, this shows another interesting wrinkle. They exercised their right to control what was on their platforms in the case of Donald Trump and Parler by de-platforming them. But now what do you do if all of a sudden the law or the Supreme Court says you can't use Clearview? How do you de-platform something that isn't on a platform?

Mike: Well, you can't. This is a private company that sells a product directly, and obviously they give you the keys to download it and run it, and so it truly is something that is just out of control.

Henry: It's despicable.

Mike: Okay. Well, one question I want to hear from both of you. What do you think is going to happen with this lawsuit? What happens if Clearview loses, or wins, and let's hope they lose? Does that signal a slow death of surveillance capitalism, or are we already way underwater? Chris?

Chris: Well, Clearview AI is, to me, the most egregious of the surveillance capitalist offenders, and the sheer fact that they've been able to survive for as long as they have doesn't bode well for privacy, in my opinion. From my standpoint, good riddance to them if the suit shuts them down. But if they're somehow able to keep running in zombie form, how long will it be? How long will it be till we really become a totalitarian surveillance state?

Henry: Exactly, and how long have they been around, Chris, approximately?

Chris: Since 2017. So, they've been illegally storing our social media photos for at least four years now.

Henry: And of course, we're enabling all of this by uploading pictures of ourselves, and everyone and their brother, all over the internet. What do you think, Mike?

Mike: Well, the immediate short answer is that if they lose the lawsuit, it becomes illegal to use it in California. That's the goal, so that would be a victory without a doubt, because California is a big state with lots of people and all this other kind of stuff. But I think the reality is that Clearview, to me, is indicative of a much deeper problem, and it's this issue of endemic and systemic cooperation between tech and government authorities in various countries, to surveil and provide that data to various visible and invisible law enforcement agencies. So, on the one hand, I know this technology is despicable, but one of the positives about it is that people do know about it.

Reporters have found out about it. They know it exists, and they are exposing who's behind it and what their motivations are. But the selling of personal data to law enforcement, to be used for whatever purpose, is the bigger issue. So, my hope, as Chris and I both said, because I love the word egregious, so let's put it out there for the third time, is that this really bad example of surveillance capitalism will draw extreme attention to the hand-in-glove relationship between tech and law enforcement that has been used to unfairly pigeonhole people and subject them to continued scrutiny, shall we say, for just being people who were communicating on the internet.

And so my hope is that this exposes the depths that people in tech are going to with our data, and it wakes people up to think twice, three times, about posting anything on any of these platforms. And I think it's indicative of other things too. Chris, we've talked about Peter Thiel many times on the blog and the podcast. Peter Thiel is one of the original investors in Facebook.

Chris: And PayPal too.

Mike: And PayPal, but he also has a company that's kind of similar to Clearview, just with different targets, and that's Palantir. So, you've got Peter Thiel, who's behind all of these different social platforms, with a company of his own whose sole purpose is to compile social data on people and sell it to law enforcement, defence agencies, things like this. And so you start to see these weird connections. You start to wonder: was this surveillance state part of the game plan back in 2004, when Facebook was still in a Harvard dorm room?

Chris: I just want to mention here, Mike, that Peter Thiel is one of the main funders of Clearview AI.

Henry: Well, he seems like he's funding everything.

Chris: Well, he's funding a lot of things that have to do with surveillance capitalism.

Mike: He's like the Emperor, bossing Darth Vader around.

Chris: Here's the thing: we cannot separate apps from the people who make them. If you're noticing a pattern with Peter Thiel, it's that he tends to focus on surveillance. If you want to know whether an app will turn out good in the long run, focus on where it's getting its funding. In the case of Peter Thiel, I wouldn't touch anything that man funds, because unfortunately it's got a propensity to not play nice with its users.

Henry: You have mentioned Peter Thiel many times throughout many of these Hot Topix and Decentralists episodes, and it never ends well, except for the fact that he makes a massive amount of money.

Mike: Right. Well, guys, let me ask you both a question. If I were to ask you what surveillance is, what would you think it was, Henry?

Henry: Well, my first immediate thought is old-school movies where there's a microphone, a bug in a room. Then it moves on to, oh, places like London, England, which has a ton of cameras, and they watch you, and the same thing happens, of course, in a lot of Asian countries. Then I start thinking about the internet, and then I start to lose my mind.
 
 Mike: Well, and Chris, when I say surveillance, what do you think?

Chris: I think of some shadowy figure watching me, taking notes on everything I'm doing.

Henry: Classic.

Mike: Correct. So, you guys have both identified what probably 90%, 99% of people would say: surveillance is the act of watching someone to see if they do something wrong. The guy across the street with the binoculars, the bug in the room, like you said, Henry. They've got this idea that this person is some international person of mystery, and they're going to follow them around. But what Clearview and Palantir, and generally this idea of surveillance capitalism, which is Facebook, Google, Amazon, are doing is not watching people. It is proactively identifying people. There's a very subtle difference when these tools are available to law enforcement. A police officer, or somebody at the FBI or the RCMP, whose day job until last week was sitting at a bank of cameras looking to see if anybody did anything wrong, if somebody breaks a window and breaks into your car, now that person, who just got Clearview AI or Palantir in their environment, is watching the same video feeds. But they're not even having to look and see if anybody does anything; they're just waiting for big red blobs to pop up saying criminal, criminal, criminal.

Henry: You're absolutely right, Mike, because in the classic definition of surveillance, the people doing the surveillance have already identified their target and are focusing on that one person. But in this, it's a free-for-all.

Mike: That is exactly the point. That is exactly what people are saying in California as part of this suit. They represent people who are, let's say, Hispanic in California, and just because you're Hispanic in California, now law enforcement has you pinned as some kind of potential illegal immigrant. So, the deal is that a proactive comparison of faulty facial biometric information, dodgy information that proactively flags potential criminal elements, means there is absolutely no way any of us can be assured we won't be mistaken for somebody who is under suspicion, proactively identified by some law enforcement agency as being under suspicion, and then arrested for something they have absolutely no idea whether you did or not. And now you're like that poor guy a couple of weeks ago in Ontario who spent six days in a jail cell.

Henry: In Montreal.

Mike: In Montreal, because somebody took a camera shot of him when there was some incident with a Montreal policeman.

Henry: In the area. He happened to be in the area.

Mike: Just happened to be in the area, and just happened to be of immigrant, visible-minority descent. And so what happens? The cops take the easy route. They pick the guy up, put him in jail for six days, and then say, oh yeah, that's right, we don't have any evidence against you, guess we've got to let you go.

Henry: Oh yeah. And he was a law student or something?

Chris: A med student.

Mike: A med student. Anyway, you know, guys, this stuff makes my blood boil, because we need to be more responsible. We need to be more thoughtful. It shouldn't have to be this way, but the fact that there are people out there with profit incentives based on the personal data you're sharing means you're never going to be okay on these platforms. We're all in trouble.

Henry: Well, Mike, with our decentralized Manyone app, which is going to be released in the next few weeks, that can't happen.

Mike: Well, what can't happen, Henry, is they can't scrape us. They can't scrape you. They can't scrape the pictures of your kids that you send to your family, because it's not on this wider network; it's not available to anybody else. So, the reality is, if you're going to share your life, share it privately, securely, decentralized. Get it out of there, so that it can't be scraped by the likes of Palantir and Clearview. If you're going to put cat memes together or stuff like that, feel free, knock yourself out, because as far as I know the Clearview thing doesn't work on cats.

Henry: That's a great way to end it, Mike.

Mike: Let's hope so. Because otherwise, maybe there's a use we could find for Clearview: you could repurpose it to walk around taking pictures against all those lost-cat photos and see if you could identify them.

Henry: Mike, Chris, thank you very much. It's a very interesting and a bit of a scary topic as well, but I think we really dissected it. At least I can feel fairly good, because it's illegal in Canada. I certainly hope for the best for all of our American friends and listeners. Gentlemen, thank you very much.

Mike: Thank you, Henry.

Chris: Thank you.