The Decentralists

Hot Topix: The Apple Doesn't Fall Far From The Tree

August 26, 2021 Mike Cholod, Henry Karpus & Geoff Glave

Apple recently announced a new feature to be released in iOS 15 that would allow all photos stored on an iPhone and in iCloud to be scanned for Child Sexual Abuse Material (CSAM). Theoretically, Apple would be able to compare every iPhone photo to known CSAM material and identify people who create and share such villainous content.

What's the big deal? Why is Apple getting so much flak over this decision? On this episode of The Decentralists, Henry, Mike, and Geoff talk about Apple and its decision to join the rest of Big Tech in readily violating their users’ privacy.

Is this technology the back door into iOS they swore they would never build?

Henry: Hey everyone, it's Henry, Mike, and Geoff of The Decentralists, and welcome to season three. Who's Geoff? Well, that's Geoff. He is our brand new co-host, and he's been a very successful product manager in the technology industry for many years. He's also our Chief Product Officer and a Director of the Pure Social Foundation. You may even recall he was one of our guests about a month ago on The Decentralists, when he shared with us the challenges that so many tech companies face when they're trying to position social media apps. Welcome, Geoff.
 
Geoff: Thank you, Henry. It's great to be here and great to kick off season three. Giddy-up.


Mike: Welcome, good sir.

Henry: Today our hot topic has got a rather interesting name: The Apple Doesn't Fall Far from the Tree. What's that all about? Apple recently announced a new feature to be released in iOS 15 that would allow all photos stored in iCloud to be scanned for Child Sexual Abuse Material, also known as CSAM. Theoretically, Apple would be able to compare every iCloud photo to known CSAM material and identify people who create and share such villainous content. That goal is certainly laudable. Nobody should support the creation and distribution of CSAM; in fact, every effort should be made to stop it. However, this time, has Apple gone too far down the slippery slope of violating iCloud users’ privacy? What is the big deal? Why is Apple getting so much flak over the decision? Join us on this episode of The Decentralists as we try to break down Apple's decision to join the rest of Big Tech in readily violating their users’ privacy. So gentlemen, what's the background of Apple's recent decision to announce CSAM photo scanning in iOS 15? Mike, can you start us off?
 
Mike: It's one of those things where Apple just announced a new feature in iOS 15, and that feature was meant to be a good thing. What they were going to do is scan any photographs that are taken or shared on an iOS device and then uploaded into iCloud. iCloud is where most Apple iPhone or iPad users back everything up: they sign in, they back it up, and their photos get backed up into iCloud. This is supposed to be your account and your backup. What Apple has announced is that they've got a database of pixel-by-pixel maps of known CSAM material, stuff that's been collected by law enforcement; everybody's probably heard the horror stories from when they bust these people's computers. The idea is that they've taken all these images of known child sexual abuse material, and they're going to compare them to the pixelated image maps of photos being uploaded into iCloud. They went as far as saying that if they detected a match, they would suspend your account and report you to law enforcement.
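
Conceptually, the matching Mike describes is a membership check of an image fingerprint against a database of known-bad fingerprints. Here's a minimal sketch in Python; the hash function and the database contents are placeholders, not Apple's actual system (Apple's NeuralHash is a perceptual hash designed to survive resizing and re-compression, whereas this sketch uses an exact cryptographic hash purely to illustrate the matching logic):

```python
# Minimal sketch of fingerprint-database matching, as described above.
# NOTE: a real CSAM-scanning system uses a perceptual hash, not SHA-256;
# this only illustrates the "compare every upload to a known database" step.
import hashlib


def image_fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: here, just SHA-256 of the raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()


# Hypothetical database of fingerprints of known illegal images
# (in reality this would be supplied by organizations like NCMEC).
known_bad_fingerprints = {
    image_fingerprint(b"known-bad-image-1"),
    image_fingerprint(b"known-bad-image-2"),
}


def scan_upload(image_bytes: bytes) -> bool:
    """Return True if an uploaded image matches the known-bad database."""
    return image_fingerprint(image_bytes) in known_bad_fingerprints
```

The slippery-slope concern discussed in this episode follows directly from the structure: nothing in the matching step knows *what* the database contains, so whoever controls `known_bad_fingerprints` controls what gets flagged.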


Henry: This is a good thing, right?


Mike: On the surface, it sounds like a good thing. But there are a couple of things we've got to think about. The first thing is that if they can scan your photos for CSAM, they can scan them for anything; if somebody decides down the road that they want to scan for pictures of people carrying protest posters, they can. But the second thing is, think of what this is. What this is, in effect, is notice, almost a request, that they're building a back door into iOS.

Henry: I don't understand that. What do you mean by that?


Mike: We talked about this a couple of weeks ago, but there was that terrorist whose iPhone they found, and the FBI went to Apple and said, ‘We need you to break the passcode protection, because after ten wrong attempts the phone is wiped.’ Apple said, ‘No, we're not building a back door into the OS.’ So the FBI said, ‘Fine.’ They went to these guys, the NSO Group in Israel, and got them to basically make that Pegasus spyware to crack the phone. Why did they do that? Because they wanted access to the information that was on the phone or in iCloud. If you think about it, Apple already has all of the pixelated maps of everybody's photos that go up into iCloud. They already do, because it's a process that's necessary for the speed of photo sharing. You know how you get the little thumbnail in your email, and you click the thumbnail and then the full picture opens up? That's because the thumbnail is this little pixelated map.


If they already have that capability, what is this announcement? This announcement is: we want the ability, or we are building in the ability, to compare anything being uploaded into iCloud to an external database. Say I'm the government of some country that does not believe in the right of somebody to be homosexual, and I have a whole bunch of pixelated image maps of pride flags and all this other kind of stuff. And I say, I want you to check everybody in my country's iCloud photos for this type of image. That's the problem.
 
Henry: That's what you mean by a back door.
 
Mike: That's what I mean by a back door. At some point, they can envision that somebody's going to come to them and say, ‘We just busted this guy. We think he's transmitting CSAM and we want access to his phone.’ Now Apple can say, ‘Hey, no worries. We've already built the back door,’ rather than losing control of the cracking of iOS to an external party like NSO Group.
 
Henry: Interesting. So Geoff, do you feel that this is a violation by Apple of their users’ privacy? The reason I ask is because, apparently, they're not doing it on your phone, but they are scanning whatever you upload into their cloud.
 
Geoff: I think the challenge is that cloud service providers like Apple and others have painted themselves into a corner, or, to use another tired analogy, they're stuck between a rock and a hard place. In many Western nations, there are laws that say it is illegal to host this content. Generally, what that means is that it's illegal to host it for the purpose of sharing it. So if I'm an evil scumbag and I upload CSAM with the intention of sharing it with other evil scumbags and charging money for it or whatever, that's what the law targets, which is why all this content is hosted on the dark web, or on shady websites in Russia or wherever the legal jurisdiction is a little more tenuous. But with Apple being this large, flagship American corporation, they run afoul of these laws, where the FBI can come knocking and say, ‘What are you doing about this content? Hey, what about these laws?’ and tap on them. Now Apple is saying, ‘Hey, we want all your pictures and all your content. But oh, some of what you are sending us is pretty shady, and we're going to do something about that.’ How they navigate those waters is a tricky thing. As Mike says, if other nations say, ‘Hey, you're hosting this other kind of content that's illegal in our country,’ like Pride, or protests against the government, or women's rights, or a hundred other things, what are they supposed to do? So these cloud providers are basically saying ‘give us all your content,’ and then turning around and hearing from governments, ‘Hey, that content is illegal.’ It's a tricky place for them to be. They scramble around with solutions like this: ‘Well, we'll compare the hash of your picture to the hash that we have from somewhere else, and we'll take it down.’ But it's messy. The root of it all is that all of these cloud providers, small and large, encourage you to just hand over all your stuff; they want to make it all convenient. Now they have to deal with the outcomes of that policy.

Henry: I've been thinking about something, and I'm not sure if it has any impact on what we're talking about, but where are Apple’s cloud servers? Are they all in America, or are they positioned around the world, and does that impact what's going on?
 
Geoff: They're all around the world, and they replicate themselves all around the world to provide redundancy and rapid response. Generally in cold places, because the servers generate a lot of heat and they're easy to cool there, and in places where there's abundant power. But they are literally all around the world. These cloud providers are very cagey about where their servers are. You'll be driving through somewhere cold and suddenly there will be this huge bunker of a grey building with some numbers on the outside, and that happens to be Google's data centre. They don't advertise it, because they don't want some nut job to show up with bombs; they don't want to be any kind of target. This data is ephemeral, so it's just moving back and forth and all around as needs require. If you're a corporation and you contract with Microsoft, they're probably the best about saying, ‘If your data must be in the US, we'll keep it in the US.’ They're willing to sign a contract to that effect. But the consumer-focused vendors like Apple, Google, and to some degree Dropbox tend not to make such commitments. They tend to just stick the data wherever and move it around as they see fit.

Mike: I want to touch on that cloud piece, going back to something you just said, Geoff, about this CSAM material normally residing on the dark web or on shady websites in countries with fewer, let's say, legal parameters around data. What you're talking about in both of those scenarios is the hosting of the data, but I would argue there's a difference, because Apple advertises iCloud as a personal storage place for your data. They're not saying, for example, that they're hosting your photo sharing, because you could take your pictures on your phone, they go up into iCloud, and then you can take those pictures and send them through Instagram, or through an SMS, or through a dark web CSAM website. When I think about this, I can see governments coming to Apple and saying, ‘If you're hosting, let's say, some photo website or Apple podcast or something with objectionable content, we're going to make you take it down.’ But saying ‘we want to be able to proactively scan content in containers that you are hosting on behalf of your users’? To me, that's a complete backpedal from the message that Apple's been pushing out ever since that iPhone thing with the terrorist in Florida. Tim Cook says your data shouldn't be used by people to make money. He's put in the little ad blockers for Facebook and all this other kind of stuff; oddly enough, they don't work on Safari, which is very interesting. But you know what I mean? I look at this CSAM announcement as a step too far. I could see how everybody would be like, ‘Hey, Apple's committed to not hosting any of this data in any kind of public forum.’
Now they're saying, we're just going to basically, like I said, build in this back door, or whatever you want to call it, to allow us, and let's face it, on behalf of some entity, to reach into people's private, advertised-as-private storage containers, where they are not taking that container and exposing it to the public. They're expecting to have some reasonable privacy for their information. All it's supposed to be is a copy of your hard drive. And Apple is saying, ‘We're going to now go into this copy of your hard drive and compare the photos.’ That, to me, is the subtle difference. That's why I don't think Apple needed to make this call. But I think I know why they did.
 
Henry: Why do you think, both of you, Apple chose to announce something so, I guess, groundbreaking or unusual right now?


Mike: Go for it, Geoff.


Geoff: Well, two reasons. The first reason, which I won't go into again because I already talked about it, is them being stuck between a rock and a hard place. I believe that someone from the Justice Department probably knocked on Apple's lawyers' door and said, ‘Hey, you're going to be fined a billion dollars if you don't take care of this.’ A few million is small change for Apple, but I would guess, and I'm just postulating here, that they were told, ‘You've got to do something about this, because you're in contravention of American federal law.’ I would guess that was the first reason. The second reason, and this one is tricky: I would suspect that for Joe and Jane Six-Pack this policy is very popular, because so many people, particularly Americans, less so Europeans, with Canadians, as always, stuck somewhere in the middle, have this attitude of ‘I don't mind my privacy being invaded because I have nothing to hide.’ You heard this after 9/11: ‘I'm not a terrorist, so I don't care if the government listens to my phone calls, because I've got nothing to hide.’ Especially around this topic of CSAM, where there is tremendous hatred of the individuals who do this, as there should be. Most of the blowback against Apple will come from privacy-minded, technical people like us who are questioning this. But the vast majority of their users, particularly their American users, will just say, ‘Well, I've got nothing to hide, and this sounds pretty good. If they're putting bad guys in jail, then go for it, Apple.’

Henry: I see.

Geoff: In the end, all of these things come down to a business decision: will we sell fewer iPhones, fewer Apple products, fewer Apple services? Will people buy fewer iCloud subscriptions if we do this? You can bet they've calculated that, ground down their pencil, and said, ‘We're probably okay.’ That's why I think it is the way it is.


I will just add one other comment around this whole subject of cloud storage and cloud this and cloud that. Part of it is, if you are so concerned about this, and you should be, whether it's Apple, Google, Dropbox, or any of these, stop uploading stuff to the cloud. You can flip one little setting on an iPhone and turn this off, turn off synchronizing everything to the cloud. Part of it is that people just love the convenience. You drop your iPhone in the lake, you go get a new one, you enter your Apple ID, you press restore, and if you're on fast Wi-Fi, magically, 30 minutes later your phone is identical to the one you had before. It's everything: your Wi-Fi settings, all your pictures, all your logins for all your applications, the applications themselves, your emails; everything is back there. Why? Because all that junk went into the cloud, all that junk was stored on servers associated with you. People are like, ‘This is awesome. I didn't have to do anything.’ Well, the price of that incredible convenience is your privacy. All that stuff is mapped to you; you enter your Apple ID back in and away you go. In the case of my phone, I don't have any of that turned on.


Henry: Neither do I.


Geoff: If I lose my phone… I have local backups. I've backed it up at home to my little storage device; all my photos are backed up to my local storage. But if I lose it, as opposed to everything magically coming back, it takes me a couple of hours to reload it all. The convenience isn't necessarily there, but the privacy is. We can all remember, 20 years ago, when you bought a new computer it could take up to a day to get the thing set up the way you wanted it. That's because it was all on your USB hard drive, or maybe you plugged the two computers together and copied all the data across, or maybe you got out a huge box of floppy discs and put it all back on, or your Iomega Zip disk. That's part of it as well. You're not forced into this, but it's time to start thinking about decentralizing your data. Stop putting all your data into the cloud; start saving some of it locally. Once it's up there, you just don't know what's happening with it.
 
Mike: Without a doubt, wise words; it's true. Convenience should never be more important to you than your privacy and your security.


Henry: People are lazy.


Mike: I think, Henry, what it is, is they just don't know the alternative. I think the convenience thing has almost dulled everybody, and the outrage on social media has dulled everybody to all of these notifications of things that happen. With this notification about CSAM, let's remember what Geoff just said: people say, ‘I've not done anything wrong, so I don't care if they have all my data.’ What I like to come back with, my riposte, shall we say, is: what happens when the definition of what is wrong changes? If you've been on iCloud for the last 15 years, and you've been taking photos and backing everything up to iCloud and doing all of this, this announcement is basically saying that any data… they're just using this picture thing as a convenient way to say it in such a way that they hope people won't object to it.


Mike: What it is: if they say they can go into your iCloud photos and compare them to a map, what stops them from going into your iCloud documents, or your iCloud everything? That's the slippery slope. Let's take two very different examples. Right now you've got a group of people in Afghanistan, depending on the numbers it could be thousands, hundreds of thousands, or millions, who could potentially be linked, somehow, to having just taken a selfie with a NATO soldier who was guarding their local bazaar over the last 20 years, and now they're enemies of the Taliban. You could argue that surely Apple's not going to take a phone call from the Taliban saying, ‘I want you to scan for every picture that has a UN crest or an American flag in it.’ But because they're building it into iOS, it means some hacker could get hold of that service and could then offer it. Somebody who's not as reputable now knows that there is a back door into iOS, because that connector between any individual iOS cloud container and this photo scanning is a back door. Anybody who gets access to it could then market that service on the dark web, or wherever, to disreputable people.


I think this is a really powerful point. People really need to realize that you may not think you've done anything wrong, but when your state outlaws abortion, or masks, and you have pictures on your phone of either of those, what's to stop somebody… maybe not the legitimate government of your state, but somebody who has an axe to grind with mask wearers, or with people who've had abortions, who now knows that the capability exists to do a proactive scan? That is the key to this whole thing. Apple is doing a complete about-face. They're saying, ‘No, no, we're going to build a back door in.’ Like I said before, I think it has everything to do with this NSO Group. By building this back door in, Apple is saying, ‘You can come to us instead of NSO Group, and we'll work with you.’ They're basically saying, ‘We're going to try to control who that person is.’ The NSO guys, presumably, will sell to anyone. That's why I think they're doing it right now.
 
Henry: Final question. Geoff, I'd like to hear from you first on this. The back door, the way it looks right now, is essentially scanning whatever you put into iCloud. Is there a possibility that Apple can somehow look into your phone, whether you've backed up to iCloud or not, or is that still not possible? Or do we even know?
 
Geoff: We don't know. When you're in the software business and people say, ‘Is it possible to do blah, blah, blah?’, well, it's software; anything's possible. So it is entirely possible for a smartphone manufacturer to deploy code onto a phone that scans everything you're doing and does this, that, and the other thing. Particularly in the case of Apple, they own the entire walled garden from soup to nuts. I'm getting a little tired of analogies this week, but it's all there. It's possible. Is it likely right now? I'm inclined to say no, because the blowback, if that were the case, would just be tremendous. But iOS 16 or 17 comes along and they decide to put something secret in there? You'd never know. There's so much traffic going back and forth on your phone, and the operating system is doing so many different things. It's indexing your files for search all the time. It's doing this, that, and the other thing. Who knows?
 
Henry: Mike, anything to add?
 
Mike: I agree. I think we would all be surprised, shocked even, at what Google and Apple, the two mainstay mobile OS providers, can already do and what they can already get. While I'm inclined to believe they're not doing it, like Geoff said, I actually believe they already can; the only difference is they just haven't built an API for it. They can already get everything. To round this whole thing up and tie it in a bow: anything that you do not store locally on your own machine is going to be more easily subject, shall we say, to these types of things. Somebody can always hack into your computer at home, or steal the thing, but if you're putting it into a cloud, it's always going to be exposed. Right now, do you think all these CSAM folks who use iPhones are throwing them in the river, deleting their iCloud accounts, and going and buying Android phones? Tell me the Justice Department hasn't served this same letter on everybody. Why would Google then not do it, or have to do it? That's why I think, if what you see is that it's just Apple, it's literally just them saying, ‘We're capitulating, because we want to control who hacks our OS.’ I think that's it. If you put it in the cloud, it's not yours. Don't be surprised if someday something you posted, maybe 15 years ago, comes back to bite you and you lose your job, or your marriage, or your reputation. That's already happening, and it's just one of those things. The cloud is always going to be there, and it's always going to be an access point for our identities and what we do.

Henry: Thank you very much, gentlemen. I can tell you it's certainly made me think a little bit more about what Apple is planning on doing. Certainly, stopping the weirdos out there is a great thing, but you've got to look behind the surface, read the fine print, and keep privacy in mind when you store your data.


Mike: Absolutely.
 
Geoff: Precisely.
 
Henry: Thank you, Geoff, and thank you, Mike.
 
Mike: Thank you, Henry.


Geoff: Thank you.