Apple recently announced a new feature, to be released in iOS 15, that would allow photos stored on an iPhone and in iCloud to be scanned for child sexual abuse material (CSAM). In theory, Apple could compare every iPhone photo against known CSAM and identify people who create and share such villainous content.
What's the big deal? Why is Apple catching so much flak over this decision? On this episode of The Decentralists, Henry, Mike, and Geoff talk about Apple and its decision to join the rest of Big Tech in readily violating its users' privacy.
Is this technology the backdoor into iOS that Apple swore it would never build?