Apple Rejects Government Demands to Use Its Child Abuse Scanning System for Surveillance

Apple has defended its new system that scans iCloud Photos for child sexual abuse material (CSAM). Controversy arose over claims that the system reduces user privacy and that governments could use it to surveil citizens. Apple announced that it has begun testing the system, which uses cryptography to detect when users upload known CSAM to its cloud storage without learning anything about the other photos stored on its servers.
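Apple has published few implementation details beyond this high-level description, but the core idea of matching uploads against a list of known-image fingerprints can be illustrated. The Swift sketch below is a hypothetical simplification, not Apple's implementation: it substitutes a plain SHA-256 digest for Apple's perceptual NeuralHash, and an in-memory set for the blinded database that the real system reportedly queries via private set intersection.

```swift
import Foundation
import CryptoKit

// Hypothetical, simplified illustration of hash-list matching.
// Apple's real pipeline reportedly uses a perceptual hash
// (NeuralHash) plus private set intersection and a match
// threshold; this sketch replaces all of that with a plain
// SHA-256 digest and an in-memory set of known hashes.

// Placeholder entries standing in for the distributed hash list.
// (This value is the SHA-256 of empty data, used only for demo.)
let knownHashes: Set<String> = [
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855"
]

/// Returns true if the photo's digest appears in the hash list.
/// In the real system, the device learns nothing from a non-match,
/// and the server learns nothing until a match threshold is crossed.
func matchesHashList(_ photoData: Data) -> Bool {
    let digest = SHA256.hash(data: photoData)
    let hex = digest.map { String(format: "%02x", $0) }.joined()
    return knownHashes.contains(hex)
}

// Example: empty Data hashes to the placeholder entry above.
print(matchesHashList(Data()))  // true
```

A key design difference worth noting: a cryptographic hash like SHA-256 changes completely if a single pixel changes, whereas a perceptual hash is built to survive resizing and re-encoding, which is why Apple's system uses the latter for image matching.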

The company has reiterated that its system is more private than those of most other companies because it uses both its servers and on-device software, delivered to people's iPhones through an iOS update.

Technology commentators and privacy advocates worry that Apple's new system could be used in some countries to scan for other types of images, such as pictures with political content. Apple stated that governments cannot force it to add non-CSAM images to the hash list. The company said it would refuse any such demands and that the detection capability is built solely to find CSAM images stored in iCloud Photos. In a published document, Apple said it has faced persistent demands to deploy government-mandated changes that would degrade users' privacy, and that it has refused them.

The company also said that the technology is restricted to detecting CSAM images and that it will not expand it at any government's request. Some cryptographers believe that countries such as China could pass laws requiring politically sensitive photos to be included as well. Apple CEO Tim Cook has stated that the company follows the laws of every country where it conducts business.

A Reputation for Privacy

Apple has carefully cultivated its reputation for defending user privacy over many years. But the controversy over whether the new system surveils users threatens the company's public reputation for building private and secure devices. Critics are concerned that the system runs partly on the iPhone itself, rather than only scanning photos after they are uploaded to the company's servers.

Ben Thompson, a technology commentator, wrote that Apple's decision is disappointing because it harms user privacy and erodes users' trust in their own devices. Apple, however, continues to maintain that the system is a genuine improvement that will reduce the amount of CSAM while still protecting user privacy.
