Three weeks ago, Apple announced a roadmap for three changes it will implement this year. The announcement covers three updates that will roll out over the course of the year for iPhones and iPads, all intended to curb CSAM in Apple's ecosystem.

What does CSAM actually mean?

CSAM is an acronym for Child Sexual Abuse Material. The term refers to sexually explicit content involving a minor (a person under 18 years of age) that is spread online using technology.

Why is Apple involved in this matter?

Three weeks ago, Apple announced that it wants to fight the spread of CSAM online and is taking further steps to stop it, at least within its own ecosystem.

Why did Apple's move towards stopping CSAM become controversial?

Apple has built its reputation on privacy. It has always focused on protecting its users, whether through encrypted messaging across its ecosystem, giving users control over data collection by third-party apps, or fighting law enforcement agencies that demand user records.

But experts say that by focusing on curbing CSAM in its ecosystem this way, Apple is compromising user privacy.

What did Apple actually announce three weeks ago that became a matter of debate?

As mentioned above, three features will roll out throughout the year to curb CSAM in Apple's ecosystem.

The first rollout affects Search and Siri. If a user searches for terms related to child abuse, Apple will redirect them to resources for reporting CSAM or getting help. This change will ship with iOS 15, watchOS 8, iPadOS 15 and macOS Monterey across Apple's products later this year. It is the least controversial of the three, but it has still been a point of debate among experts.
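To make that flow concrete, here is a minimal sketch of how such a search intervention could look. Apple has not published how queries are classified, so the term list, matching rule and function names below are purely hypothetical.

```python
# Hypothetical sketch of a Siri/Search intervention, not Apple's implementation.
CSAM_RELATED_TERMS = {"example term 1", "example term 2"}  # placeholders only

HELP_RESOURCES = "Resources for reporting CSAM or getting help"


def handle_search_query(query: str) -> str:
    """Redirect CSAM-related queries to help resources instead of search results."""
    if any(term in query.lower() for term in CSAM_RELATED_TERMS):
        return HELP_RESOURCES
    return "normal search results"
```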

The second rollout affects Apple's messaging ecosystem. It adds a parental control feature to Messages, meant to protect children from CSAM. Parents can choose whether or not to opt in for their child's device. If they opt in, incoming and outgoing pictures on devices used by people under 18 will be scanned by an on-device image classifier designed to detect CSAM using nudity detection. This is where the controversy begins: critics argue that scanning a user's "private" pictures is itself a breach of privacy, which contradicts Apple's claim of providing the most secure environment for user data. This rollout will also ship with iOS 15, watchOS 8, iPadOS 15 and macOS Monterey across Apple's products.
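A rough illustration of the opt-in logic described above is sketched below. The `looks_sexually_explicit` function is a stand-in for Apple's on-device classifier, which is not public, and the account fields and return values are simplified assumptions.

```python
from dataclasses import dataclass


@dataclass
class MessagesAccount:
    age: int
    parental_opt_in: bool  # parents must explicitly enable the feature


def looks_sexually_explicit(image_bytes: bytes) -> bool:
    """Stand-in for Apple's on-device nudity classifier (the real model is not public)."""
    return False  # placeholder


def handle_image(account: MessagesAccount, image_bytes: bytes) -> str:
    # Scanning applies only to under-18 accounts whose parents opted in.
    if account.age >= 18 or not account.parental_opt_in:
        return "deliver"
    if looks_sexually_explicit(image_bytes):
        # Apple described blurring the image and warning the child;
        # the exact behaviour is simplified to a single action here.
        return "blur_and_warn"
    return "deliver"
```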

The third rollout focuses on barring CSAM from iCloud. It adds a feature that scans images stored in iCloud Photos for CSAM; if the scanner finds a match, it reports it to an Apple moderator, who passes it on to the National Center for Missing and Exploited Children (NCMEC). This flow of data from iCloud to an Apple moderator to NCMEC sparked debate among experts and turned out to be the reason for the huge backlash against Apple. Although Apple claims the scanner is designed only to detect known CSAM in iCloud, experts dispute the claim and see it as an intentional backdoor into users' privacy.
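The general shape of that pipeline (hash each photo, compare against NCMEC's database of known-CSAM hashes, escalate to a human moderator only past a threshold) can be sketched roughly as below. This is not Apple's actual system: Apple uses a perceptual hash (NeuralHash) with encrypted safety vouchers, the hash database is a placeholder, and the threshold value here is purely illustrative.

```python
import hashlib

# Hypothetical stand-in for the database of known-CSAM hashes supplied by NCMEC.
KNOWN_CSAM_HASHES: set[str] = {"placeholder-hash-1", "placeholder-hash-2"}

# Moderators are alerted only past a match threshold; this value is illustrative.
REPORT_THRESHOLD = 30


def image_hash(image_bytes: bytes) -> str:
    # A real system uses a perceptual hash (Apple's NeuralHash) so that resized or
    # re-encoded copies of a known image still match; SHA-256 is only a placeholder.
    return hashlib.sha256(image_bytes).hexdigest()


def account_needs_review(icloud_photos: list[bytes]) -> bool:
    """Return True once matches cross the threshold for human moderator review."""
    matches = sum(1 for photo in icloud_photos
                  if image_hash(photo) in KNOWN_CSAM_HASHES)
    # Only past the threshold would an Apple moderator review the matches and,
    # if confirmed, forward a report to NCMEC.
    return matches >= REPORT_THRESHOLD
```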

What is Apple's take on this?

Apple, however, claims that it will restrict any attempt to abuse its systems, and it has put a number of safety measures in place: parents cannot enable alerts for older teens in Messages, iCloud's safety vouchers are encrypted, a threshold of matches must be reached before moderators are alerted, and the searches are US-only and strictly limited to NCMEC's database.

Apple’s press release:

Apple’s CSAM detection capability is built solely to detect known CSAM images stored in iCloud Photos that have been identified by experts at NCMEC and other child safety groups. We have faced demands to build and deploy government-mandated changes that degrade the privacy of users before, and have steadfastly refused those demands. We will continue to refuse them in the future. Let us be clear, this technology is limited to detecting CSAM stored in iCloud and we will not accede to any government’s request to expand it.

What are the experts saying?

Experts believe that scanning users' iCloud libraries and personal messages is a major blow to the very thing Apple stakes its reputation on: privacy.

A pair of researchers who worked on similar CSAM detection software put it this way:

We wrote the only peer-reviewed publication on how to build a system like Apple’s — and we concluded the technology was dangerous.

say Jonathan Mayer and Anunay Kulshrestha, the two Princeton academics behind the research.
