Why Apple’s crackdown on child abuse images is no easy decision

Apple will inspect every photo uploaded to the cloud by US users of iPhones and iPads to identify images of child sexual abuse, and will report any matches to a nonprofit that investigates cases of child exploitation. The new measure has been praised by child welfare charities but condemned by privacy campaigners, who argue it opens the door to other forms of surveillance, including by authoritarian governments.

Rather than examining the photographs themselves, Apple’s neuralMatch software will include an algorithm that creates …
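As described in public reporting, the system compares image fingerprints against a database of hashes of known abuse images rather than inspecting picture content directly. A minimal sketch of that matching step, using Python's standard `hashlib` as a stand-in for the perceptual hashing Apple's system actually uses (a real perceptual hash is robust to resizing and re-encoding, which a cryptographic hash is not; all names and data below are hypothetical):

```python
import hashlib

# Hypothetical database of known-image fingerprints (hex digests).
# In the real system these would be perceptual hashes supplied by
# child-safety organizations, not SHA-256 digests of raw bytes.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def fingerprint(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash: digest of the raw bytes."""
    return hashlib.sha256(image_bytes).hexdigest()

def matches_known_image(image_bytes: bytes) -> bool:
    """True if the image's fingerprint appears in the known-hash set."""
    return fingerprint(image_bytes) in KNOWN_HASHES

print(matches_known_image(b"example-known-image-bytes"))  # True
print(matches_known_image(b"some-unrelated-photo"))       # False
```

The key property of this design is that the scanning service only ever sees fingerprints, not photographs, and can only flag images already catalogued in the reference database.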
