With the arrival of iOS 15.2, Apple introduced a new Communication Safety feature in Messages aimed at protecting children using iPhones from photos containing nudity. So far, the feature is limited to the United States, but according to recent reports it will soon be extended to Australia, Canada, New Zealand, and the U.K. as well. The Guardian reports that the child safety feature will arrive "immediately" in the U.K., while the Daily Mail claims it has already launched there. The feature arrives via software update and requires at least iOS 15.2, iPadOS 15.2, or macOS Monterey 12.1.

Communication Safety in Messages automatically scans images received in the Messages app and, if it detects traces of nudity, blurs the image preview. The system then shows a prompt asking users whether they really want to view the image, and also directs them to resources that can help them deal with this kind of content. The same warning kicks in when the user tries to send a picture containing nudity. The feature must be enabled manually from the Screen Time settings, and the child's iPhone must first be enrolled in a Family Sharing group.

Available as a toggle called "Check for Sensitive Photos" in the Communication Safety section, the feature offers three ways to handle scenarios where troubling images are sent or received. The first is "Message Someone," which lets the child reach out to a trusted adult for help.

There is also an option to block the person on the other side of the conversation with a tap. Finally, an "Other Ways to Get Help" button leads to online safety
resources that offer guidance in such sensitive situations. Additional resources also surface in Safari search when users look for guidance related to child exploitation.
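The on-device flow described above can be sketched in rough pseudocode. This is a conceptual illustration only, not Apple's actual implementation; the function and option names are assumptions made for clarity.

```python
# Conceptual sketch (NOT Apple's implementation) of the Communication
# Safety decision flow for an incoming image. All analysis in the real
# feature happens locally on the device; nothing is sent to Apple.

from dataclasses import dataclass, field

@dataclass
class SafetyResult:
    blurred: bool
    options: list = field(default_factory=list)

def handle_incoming_image(contains_nudity: bool) -> SafetyResult:
    """Decide how to present an incoming image to the child."""
    if not contains_nudity:
        # Clean images are shown normally, with no prompt.
        return SafetyResult(blurred=False)
    # Flagged image: blur the preview and surface the three choices
    # described in the article.
    return SafetyResult(
        blurred=True,
        options=["Message Someone", "Block Contact", "Other Ways to Get Help"],
    )
```

In this sketch, the classifier's verdict is passed in as a boolean; the real system runs an on-device machine-learning model against each image before deciding whether to blur it.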

When Apple initially announced the plan last year, the system included a warning feature that would notify parents whenever their child sent or received a picture
containing nudity. Experts raised concerns that such alerts could inadvertently out at-risk LGBTQ children. Apple subsequently dropped the parental notification functionality, but the rest of the system remained intact.

It is also worth noting that all image scanning occurs locally on each iPhone, and Apple cannot access the photos any user sends or receives. Apple also planned a similar system for detecting CSAM (child sexual abuse material) in photos stored in iCloud, but following significant pushback over privacy concerns, that plan was delayed.

Author's Bio: 

Deepak Singh