Failure to list child sexual abuse material (CSAM) as a reporting option, difficult-to-locate reporting menus, and requirements that deter the flagging of illegal content all discourage reporting.

These are some of the major barriers people face as they navigate popular online platforms.

Among those affected are survivors of sexual abuse who discover their own child sexual abuse imagery online while trying to report CSAM on some of the most popular web platforms.

This is the finding of a new report by the Canadian Centre for Child Protection (C3P).

“As we reviewed each platform, we came to realize these companies nearly always provide users with a clear and formal process for reporting copyright infringement. Yet, when it came to reporting images of child sexual abuse, the mechanisms in place were, for the most part, inadequate,” said Lianna McDonald, Executive Director of C3P.

The review was prompted by feedback from survivors whose child sexual abuse was recorded and distributed online, as well as by concerns raised by private citizens who had tried to report such content.

C3P undertook a systematic examination of the CSAM-specific reporting mechanisms available on 15 major platforms, including Facebook, YouTube, Twitter and Instagram. It also covered adult content sites such as Pornhub, a known facilitator of such content.

With the exception of Microsoft's Bing search engine, none of the platforms evaluated by C3P provided users with a content reporting option specific to CSAM.

This held true for online posts, direct messages and user reports alike.

Instead, platforms generally opted for non-specific or ambiguous reporting language, the C3P researchers found.

Figures from the U.S.-based National Center for Missing and Exploited Children show tech companies reported more than 69 million CSAM images on their systems last year.

C3P's own tipline has also experienced a dramatic increase in public reports of child exploitation during the COVID-19 pandemic, including an 81 per cent spike this past spring.

“For over a decade, the technology sector has not adequately addressed horrific child sexual abuse imagery being distributed on their services. These titans of tech initially denied the existence of the problem.

“They have since admitted the problem, but then denied that a technological solution exists.

“Eventually, they were presented with a technological solution, yet only then begrudgingly and anemically began to address the problem of sexual violence against children.

“More than a decade later, however, many online industry giants still don't have even the most basic safeguards, including a clear and easy mechanism to report child sexual abuse imagery. This is simply inexcusable,” said Dr. Hany Farid, co-developer of PhotoDNA and professor at the University of California, Berkeley.

As part of the report, C3P developed five recommendations to clarify and streamline the CSAM reporting process for platforms that allow user-generated content to be uploaded onto their services:

- Creating reporting categories specific to child sexual abuse material discovered inadvertently online
- Providing reporting options within easy-to-locate reporting menus
- Ensuring reporting functions are consistent across the entire platform
- Permitting the reporting of content that is visible without creating or logging into an account
- Eliminating mandatory personal information fields in content reporting forms

This evaluation comes just months after the Five Country Ministerial released an international set of voluntary principles to counter online child sexual exploitation and abuse.

On March 5, 2020, in coordination with the governments of Canada, Australia, New Zealand and the United Kingdom, the U.S. Department of Justice released a set of 11 voluntary principles aimed at ensuring online platforms and services have the systems they need to combat online child sexual exploitation.

However, C3P, along with international child protection allies, points out that, based on the results of this new report, the benchmarks outlined are not yet being met.

“Tech companies who signed up to Five Eyes' voluntary principles to counter online exploitation have failed the first test and, if their commitments to combat sexual abuse are to be taken seriously, this must be rectified without delay,” said Peter Wanless, Chief Executive of the National Society for the Prevention of Cruelty to Children, the UK's leading children's charity.

Currently, the online world is awash with child sexual abuse and exploitation material, yet in the greater scheme of things little or nothing is being done to address it.

Worse still, EU legislation due to take effect on 21 December 2020 will make it considerably more difficult to identify, remove and investigate CSAM content.

You can read or download the full report here.


By Children of the Digital Age

We offer workshops and courses, both nationally and internationally, for parents, children and workplace staff, as well as conferences on cyber safety, parental controls, online addiction and online privacy, along with consultancy on social engineering, data protection, ransomware and much more. For further information, please contact us.

© 2021 Children of the Digital Age All Rights Reserved. Children of the Digital Age is a Registered Company No. 582337