Apple Asked by Policy Groups to Drop Inspection of iMessages for Abuse Images

iPhone maker Apple Inc. (AAPL) has been asked by more than 90 policy and rights groups, in an open letter, to drop its plan to scan children’s messages for images portraying nudity and to scan adults’ phones for images of child sexual abuse.

“Though these capabilities are intended to protect children and to reduce the spread of child sexual abuse material, we are concerned that they will be used to censor protected speech, threaten the privacy and security of people around the world, and have disastrous consequences for many children,” reads part of the letter.

The signatories include policy and rights groups from the United States as well as from abroad. Many of them are worried about the effects of the move, given that countries differ widely in their laws governing data encryption and the privacy of personal user data.

Through a spokesperson, Apple said it had addressed concerns about the privacy and security of user data in a document released last week, which details why the complex architecture of the scanning software should resist attempts to subvert it.

Additionally, Apple said it would turn down demands to expand the image-detection system beyond images of children flagged by clearinghouses in various jurisdictions.
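For readers unfamiliar with how such detection works, the approach Apple has described is based on hash matching: an image on the device is reduced to a hash and compared against a database of hashes of known abuse material supplied by clearinghouses, so only previously flagged images can produce a match. The sketch below illustrates that basic idea only; it is not Apple’s implementation (Apple’s announced system uses a perceptual hash it calls NeuralHash together with cryptographic matching protocols), and the FLAGGED_HASHES set, hash_image, and is_flagged names are hypothetical. SHA-256 stands in for a perceptual hash here so the example runs without image-processing dependencies.

```python
# Illustrative sketch of hash-based image matching. NOT Apple's system:
# real deployments use perceptual hashes (which tolerate resizing and
# re-encoding) and cryptographic protocols, not a plain set lookup.

import hashlib

# Hypothetical database of hashes flagged by a clearinghouse.
# This entry is simply SHA-256("foo") so the demo below matches.
FLAGGED_HASHES = {
    "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def hash_image(image_bytes: bytes) -> str:
    """Stand-in for a perceptual hash; SHA-256 is used only so the
    sketch is self-contained and runnable."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_flagged(image_bytes: bytes) -> bool:
    """Return True if the image's hash appears in the flagged database."""
    return hash_image(image_bytes) in FLAGGED_HASHES

if __name__ == "__main__":
    print(is_flagged(b"foo"))         # True: hash is in FLAGGED_HASHES
    print(is_flagged(b"other data"))  # False: unknown image, no match
```

A consequence of this design, and part of what the open letter disputes, is that the system can only ever be as narrow as the hash database it is given: whoever controls the list of flagged hashes controls what the scanner detects.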
