iOS 15: Apple planning to release new updates to fight CSAM.

Apple never lags behind when it comes to innovation and experimentation, so when rumors arose that it was creating new child-safety software, we were excited to hear about it.

According to reports, Apple is working on a program that would scan your iPhone photographs for child sexual abuse material (CSAM), which includes images, videos, and other material connected to child pornography and other forms of child abuse.

To learn more about CSAM, click "here"...

This new functionality, which is expected to be unveiled shortly, will run client-side, on the user's phone, checking photos against a set of known perceptual hashes. If high-confidence matches are found, they will be flagged to Apple's servers. Although it is unknown whether this system could be exploited, the device-side checks are intended to preserve the user's privacy.
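To make the idea of perceptual-hash matching a little more concrete, here is a minimal Swift sketch of how a photo's hash might be compared against a list of known hashes using a Hamming-distance threshold. The function names, hash values, and threshold are illustrative assumptions only; Apple has not published its actual implementation.

```swift
import Foundation

// Minimal sketch of perceptual-hash matching; the hash values, function names,
// and distance threshold below are illustrative assumptions, not Apple's design.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    // Number of differing bits between the two 64-bit hashes.
    (a ^ b).nonzeroBitCount
}

func isLikelyMatch(photoHash: UInt64, knownHashes: [UInt64], maxDistance: Int = 4) -> Bool {
    // A photo is flagged only if its hash is very close to a known hash, which is
    // what lets slightly edited copies of an image still match.
    knownHashes.contains { hammingDistance(photoHash, $0) <= maxDistance }
}

// Example with made-up hash values: the photo's hash differs by a single bit.
let knownHashes: [UInt64] = [0x0123_4567_89AB_CDEF]
let photoHash: UInt64 = 0x0123_4567_89AB_CDEE
print(isLikelyMatch(photoHash: photoHash, knownHashes: knownHashes)) // prints "true"
```

The distance threshold is the key design choice in such a scheme: too strict and edited copies slip through, too loose and unrelated photos can trigger false positives.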

Matthew Daniel Green, an Associate Professor at the Johns Hopkins Information Security Institute in the United States, said: “This technology is still under development but it has numerous applications and we see it becoming a critical component in adding surveillance to encrypted communications systems”.

Apple will begin with non-E2E (not end-to-end encrypted) pictures, that is, photos that individuals have already shared to the cloud, so it does not infringe on anyone's privacy. However, "You have to wonder why anybody would design a system like this if scanning E2E pictures weren’t the aim," Green said in a lengthy Twitter thread.

Apple's new tool may cause user anxiety since it may produce false positives, even with safeguards in place to prevent misuse. In addition, governments may be able to exploit the system to search for material that could influence public attitudes about political involvement, rather than just unlawful child content.

Previously, Apple was found to have used similar hashing techniques to scan for child abuse material in emails sent by iPhone users. Last year, it was also alleged that the Cupertino company had abandoned encrypted iCloud backups in order to quietly give law enforcement and intelligence organizations backdoor access.

Nevertheless, the new step appears to have been taken with privacy in mind, since it will be implemented on the user's device without requiring photos to be sent to the cloud. The exact extent of the tool is unknown because Apple has not yet provided any formal details, although Green hinted that an announcement might happen this week.

A clear explanation of the tool

Apple plans to build a program that checks users' photographs for child abuse material. A highly capable AI will go through every message or piece of media sent to a user, and if there is even a hint of CSAM, the user will immediately be notified about it. Any detected CSAM content will then be stored directly in iCloud, where it can later be used to track the sender and penalize them.
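As a rough illustration of the flow described above, here is a hypothetical Swift sketch: an incoming item's hash is checked against a list of known hashes, a match triggers a notification, and in this simplified picture the item would be queued for iCloud storage. Every name here, including the commented-out iCloud helper, is an assumption for illustration, not Apple's published design.

```swift
import Foundation

// Illustrative sketch of the flow described above; the types, names, and the
// commented-out iCloud helper are assumptions, not Apple's actual system.
struct ScanResult {
    let itemID: UUID
    let flagged: Bool
}

func scanIncomingItem(itemID: UUID, itemHash: UInt64, knownHashes: Set<UInt64>) -> ScanResult {
    // Check the incoming photo or attachment against the known-CSAM hash list.
    let flagged = knownHashes.contains(itemHash)
    if flagged {
        // Per the description above: notify the user immediately, and the flagged
        // item would be stored in iCloud so the sender can later be traced.
        print("Warning: incoming item \(itemID) matched a known CSAM hash.")
        // queueForICloudUpload(itemID)  // hypothetical helper, not implemented here
    }
    return ScanResult(itemID: itemID, flagged: flagged)
}

// Example usage with made-up hash values.
let result = scanIncomingItem(itemID: UUID(), itemHash: 0xBEEF, knownHashes: [0xBEEF, 0xCAFE])
print(result.flagged) // prints "true"
```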

Reactions of the masses

After hearing this news, many people have doubts about what exactly this tool is. Is it a threat to users' privacy? What if personal information is exploited? Many more questions are probably on your mind. Well, there is no official statement from Apple about this yet, but Green hinted in his Twitter thread that Apple may make an announcement this week.

On the face of it, this new feature might help judicial and law enforcement agencies detect and reduce CSAM content throughout the world. But a question always remains: what if the step backfires? Amid all this speculation, Apple seems firm in its decision about the new feature, as it is confident it will help make the internet a better place for everyone.

Winding-up

CSAM is a very real threat that affects numerous people all over the world. In many countries, producing, creating, and distributing CSAM content is a crime that can carry up to 25 years in prison.

So, will this plan work out? Well, it’s too soon to write anything off. But we do hope Apple will take the necessary steps to make this new feature more appealing to its consumers.