Apple announced today that it is delaying the rollout of its controversial new surveillance technology, which sought to identify child sexual abuse material (CSAM) on users' devices.
Apple has updated its previous press release with this disclaimer at the top:
“Update as of September 3, 2021: Previously we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them and to help limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features.”
Initial concerns were wide-ranging:
- How accurate would this scanning be? This was a particular concern because the consequences of a false match are so extreme: a misclassification could mean a phone owner being reported to law enforcement.
- Many people didn’t like Apple accessing their photo libraries or scanning their media.
- Apple could review your searches (potentially in real time) and block them or report them.
- A number of privacy groups were concerned that once this “thin wedge” was installed, it could be misused in the future.
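To make the accuracy concern concrete: any system that matches content against a fixed blocklist of hashes has some accidental-collision rate. The sketch below is a deliberately crude toy (a truncated SHA-256 standing in for a hash function, random bytes standing in for photos) and has nothing to do with Apple's actual NeuralHash design or its claimed error rates; it only illustrates why a small effective hash space makes false matches a real statistical possibility at scale.

```python
import hashlib
import random

def toy_hash(data: bytes, bits: int = 16) -> int:
    # Toy stand-in for a perceptual hash: truncate SHA-256 to `bits` bits.
    # A real perceptual hash is designed so similar images collide on
    # purpose; here we only model the size of the output space.
    digest = hashlib.sha256(data).digest()
    return int.from_bytes(digest, "big") >> (256 - bits)

random.seed(0)

# Hypothetical blocklist of 1,000 hashed items.
blocklist = {toy_hash(random.randbytes(64)) for _ in range(1000)}

# "Scan" 100,000 unrelated inputs and count accidental matches.
false_matches = sum(
    1 for _ in range(100_000) if toy_hash(random.randbytes(64)) in blocklist
)
rate = false_matches / 100_000
print(f"false-match rate: {rate:.2%}")
```

With a 16-bit hash space (65,536 values), roughly 1–2% of innocent inputs collide with the 1,000-entry blocklist. Real systems use far larger hash spaces and matching thresholds to push this rate down, but the example shows why critics wanted Apple's false-positive claims independently verified before a human-review-and-report pipeline was attached to the output.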
Note that Apple has only announced that it is taking "additional time," not that it is canceling the initiative. What are your thoughts on this?