The news that Apple will launch a new child safety feature which scans images for nudity has been welcomed in the UK.

The feature, which scans images sent to and from children’s devices, will soon be available in the UK.

Called ‘Communication Safety in Messages’, it is a new tool designed to warn children using an Apple device when they receive or send photos that contain nudity. The feature first launched in the US last December.

If the on-device AI technology detects nudity, the image is blurred and the child is warned about the potentially sensitive content, while being presented with options to message someone they trust for help.

Apple initially proposed the use of a new set of tools which could be used to detect child sexual abuse material (CSAM) last August. However, the roll-out was postponed following an enormous backlash from privacy groups and critics.

As well as alerts for explicit images on a child’s phone, Apple was looking to introduce barriers around searching for CSAM, and would have forwarded alerts to law enforcement if CSAM was detected in a user’s iCloud photos.

While Apple said the measures would be privacy-preserving, concerns were raised as to how they could open a backdoor into widespread surveillance and monitoring of content.

Apple has made a number of changes to its image scanning feature since then. The initial announcement said parents of users under the age of 13 would automatically be notified if explicit images were detected, but that is no longer mentioned in the update.

The communication safety feature is also switched off by default and has to be turned on by parents.

“Messages analyses image attachments and determines if a photo contains nudity, while maintaining the end-to-end encryption of the messages,” Apple said.

“The feature is designed so that no indication of the detection of nudity ever leaves the device. Apple does not get access to the messages, and no notifications are sent to the parent or anyone else.”
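For readers curious about how such an on-device check can work in practice: Apple has not published the internals of the Messages feature, but it has since made a similar on-device classifier available to app developers as the SensitiveContentAnalysis framework (iOS 17 and later). The SwiftUI sketch below is a simplified illustration of the blur-and-warn flow described above, not Apple’s actual implementation; the view, the warning wording and the blur radius are our own stand-ins.

```swift
import SwiftUI
import SensitiveContentAnalysis // Apple's on-device framework (iOS 17+);
// apps need the com.apple.developer.sensitivecontentanalysis.client entitlement,
// and the check is gated by the user's Sensitive Content Warning setting.

// Illustrative only: a simplified stand-in for the kind of check Messages
// performs. The analysis runs entirely on the device; the result is used
// only to update the UI, so nothing is sent to Apple or anyone else.
struct IncomingImageView: View {
    let imageURL: URL
    @State private var isSensitive = false

    var body: some View {
        AsyncImage(url: imageURL) { image in
            image
                .resizable()
                .scaledToFit()
                .blur(radius: isSensitive ? 30 : 0) // blur rather than block
        } placeholder: {
            ProgressView()
        }
        .overlay {
            if isSensitive {
                // Warn the child and point them towards someone they trust.
                Text("This photo may contain nudity. You don't have to view it. You can message someone you trust for help.")
                    .padding()
            }
        }
        .task {
            // On-device classification; throws if the user has the
            // sensitive-content policy disabled, so we fail safe to false.
            let analyzer = SCSensitivityAnalyzer()
            let result = try? await analyzer.analyzeImage(at: imageURL)
            isSensitive = result?.isSensitive ?? false
        }
    }
}
```

The key design point Apple describes is visible in the sketch: the classifier’s verdict never leaves the handset and is only used to blur the photo and show a warning locally.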

CSAM-related safety features are also being added to Siri, Spotlight and Safari search, which will intervene if a user searches for queries related to child exploitation.

“These interventions explain to users that interest in this topic is harmful and problematic, and provide resources from partners to get help with this issue,” Apple said.
Siri will also help users who ask how to report CSAM content, directing them to resources on how to file a report.

These new features will assist both children and parents, but parents still need to be proactively engaged in checking their children’s digital devices. It is essential to maintain constant supervision of your child’s online activity, as well as having suitable parental controls enabled.

Here at Children of the Digital Age we provide world-class education to help protect parents and children against online harm.

You can contact us by email at codainfo@protonmail.com or call us on +353 87 1096087.


By Children of the Digital Age

We offer workshops, courses and conferences, both nationally and internationally, for parents, children and workplace staff, covering cyber safety, parental controls, online addiction and online privacy, as well as consultancy on social engineering, data protection, ransomware and much more. For further information, please contact us at codainfo@protonmail.com

© 2021 Children of the Digital Age. All Rights Reserved. Children of the Digital Age is a Registered Company, No. 582337.
