Apple Postpones the CSAM Photo-Scanning System

By Bill Toulas / September 4, 2021

Apple has sent emails to media publications informing them that the CSAM (Child Sexual Abuse Material) scanning system, which had been planned for imminent roll-out, will be delayed after all, as the firm needs more time to discuss the implementation with privacy experts. Clearly, the backlash from every corner of the community sent a message to the tech giant that CSAM scanning, however well-intended it may have been presented to be, would tarnish the privacy framework of iOS, potentially creating a negative market trend for the company.

For now, all we know is that the CSAM system won’t be introduced in the upcoming iOS 15; when it will actually launch remains an open question. Apple will have to balance transparency against secrecy, as the system’s privacy safeguards will need to be verifiable if it is to be trusted. At the same time, some details will have to remain undisclosed so as not to leave any margin for overriding or bypassing it. And with Apple being Apple, sharing extensive technical data about any of its proprietary products is atypical, and as such, highly unlikely.

As the company itself admitted last month, a child abuse scanning system already runs in iCloud Mail, something that was undocumented and never clearly communicated. CSAM scanning could be introduced to other system components gradually in the same quiet manner, rather than by purposefully drawing media attention in the expectation that it would be perceived as a positive development. As the feedback showed, Apple hadn’t thought this through, so for now, the company is retracting its plans to do it openly, at least.

This is why some privacy-advocating organizations, such as the EFF, aren’t celebrating the postponement announcement and are instead calling for the complete abandonment of the CSAM plans. As the EFF points out, it’s positive to see Apple listening to the concerns of its customers and the various organizations that objected to the system. Still, they don’t see any way to implement such a system while respecting people’s privacy.

One fundamental problem that underpins all types of backdoors remains: the ability to abuse them for purposes beyond their core functionality, such as spying on people. Another is the poor reliability of the system, which could generate false flags and cause legal trouble for innocent people. The potential for both has already been demonstrated by cryptography researchers who dove into the CSAM system and almost immediately found flaws.
