
Apple fights child abuse by scanning personal photos on iCloud


Anna Savelyeva

Photo: Bloomberg

Jane Horvath, Apple's senior director of global privacy, has confirmed the practice. The company automatically scans images stored in the cloud to assist certain investigations, without giving anyone access to users' smartphones.

According to Horvath, Apple is often approached by authorities and various agencies seeking data for investigations, but it is also important for the company to protect its users and their personal information. To avoid handing over end-to-end encryption keys, Apple uses image-matching technology to help find and report child sexual abuse material.

“Like email spam filters, our systems use electronic signatures to look for suspicious images. Accounts used to store photos that violate our terms of service are subject to immediate removal. Apple strives to protect children across our ecosystem wherever our products are used, and we continue to support innovation in this area.”
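As an illustration only, here is a minimal sketch in Python of the kind of signature matching the quote describes. It assumes a plain SHA-256 hash in place of Apple's undisclosed “electronic signatures” (real systems rely on perceptual hashes that survive resizing and re-encoding), and the database and function names here are hypothetical, not Apple's actual implementation.

import hashlib

# Hypothetical database of signatures of known prohibited images,
# analogous to the hash lists maintained by child-safety organizations.
KNOWN_SIGNATURES = set()

def signature(image_bytes):
    # Compute a deterministic signature for the image contents.
    # SHA-256 only matches exact byte sequences; a perceptual hash
    # would also match re-encoded or resized copies.
    return hashlib.sha256(image_bytes).hexdigest()

def flag_upload(image_bytes):
    # Flag the upload if its signature matches a known entry; the
    # scanner never interprets the image, it only compares hashes.
    return signature(image_bytes) in KNOWN_SIGNATURES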

According to Horvath, terrorist material and child abuse imagery are “disgusting,” and no one at the company wants to see that kind of material on its platform.

The changes to Apple's privacy policy took effect in 2019. At the time, Apple announced that it could scan images for child abuse and sexual exploitation. Under the new rules, the company has the right to use users' personal information for account security purposes, in particular to pre-screen and scan uploaded material to identify potentially illegal content.

Source: The Telegraph

