Researchers who found out about the plan were alarmed, however. The plan was described as a compromise between Apple's promise to protect customer privacy and demands from the US government, intelligence and law enforcement agencies, and child safety activists for help in battling terrorism and child pornography. The program is initially intended to be rolled out in the US only.

Apple plans to scan US iPhones for child abuse imagery - Financial Times, August 5, 2021

Dubbed "neuralMatch," the system will reportedly scan every photo uploaded to iCloud in the US and tag it with a "safety voucher." Once a certain number of photos – not specified – are labeled as suspect, Apple will decrypt the suspect photos and notify human reviewers, who can then contact the relevant authorities if the imagery is verified as illegal, the FT report said.
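The reported flow – tag each upload with a voucher, escalate only once a threshold of suspect matches is crossed – can be sketched roughly as follows. This is a hypothetical illustration only: the function names, the hash-set matching, and the threshold value are assumptions, since the FT report does not specify the threshold or any implementation details.

```python
# Hypothetical sketch of the threshold-based flagging flow described in the
# FT report. All identifiers and the threshold value are illustrative
# assumptions, not Apple's actual design.

SUSPECT_THRESHOLD = 30  # placeholder; the real number was not disclosed

def scan_photo(photo_hash, known_hashes):
    """Tag an upload with a 'safety voucher' recording whether it matched
    a database of known imagery."""
    return {"photo": photo_hash, "suspect": photo_hash in known_hashes}

def review_account(vouchers):
    """Surface suspect photos for human review only once the number of
    suspect vouchers crosses the threshold; otherwise nothing is reviewed."""
    suspect = [v["photo"] for v in vouchers if v["suspect"]]
    if len(suspect) >= SUSPECT_THRESHOLD:
        return suspect  # escalate to human reviewers
    return []           # below threshold: no photos are decrypted or reviewed
```

The key property of such a scheme is that individual matches reveal nothing on their own; only an account accumulating many suspect vouchers is escalated.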