Sexual Image Removal Tool
The Internet Watch Foundation (IWF) and the NSPCC's Childline UK have launched a Sexual Image Removal Tool for children. This will offer some comfort to parents and children in cases where sexual content has been shared online.
Sharing explicit self-generated sexual images
One of the most difficult issues parents now have to face is the reality of children sharing explicit, self-generated sexual images with others. The circumstances behind why a young person might share such an image vary from child to child.
Some may have sent content to a new or long-term partner, only for it to be shared on without their consent. Others become victims, groomed online, blackmailed or otherwise exploited into sharing the content.
Staggering increase
In the UK, reports to the IWF of self-generated sexual images more than doubled between January and April 2021 compared with the same period in 2020, a staggering increase from 17,500 to 38,000.
We know from experts who work closely with children in this field that when self-generated sexual content is shared online, it has a devastating impact on entire families.
Many children feel shame and embarrassment. They are terrified of what may happen to them and their reputation, in both the short and long term. Many turn the distress they feel inwards, with high rates of self-harm and worse.
In Ireland, legislation has changed to address and criminalise the non-consensual sharing of self-generated sexual content. However, it will be some time before this has any positive impact, primarily because taking and sharing sexual content on mobile devices has become normalised among young and old alike.
OnlyFans part of the problem
The problem is made far worse by sites like OnlyFans, which monetise sexual self-exploitation. Some users on the platform make substantial sums of money, so discouraging the behaviour among young people will be a difficult task.
Report / Remove
Thankfully, this new sexual image removal tool is available in circumstances where a child has had a sexually explicit image shared online. Childline UK and the IWF recently launched it to help young people remove nude photos or videos of themselves from the internet. First piloted in February 2020, Report / Remove can be found on the Childline website and used by anyone under the age of 18.
As part of Report / Remove, the user has to verify their age. However, they can expect the same level of confidentiality they would from all their interactions with Childline; they do not need to provide their real name to Childline or the IWF if they don't want to. Childline will work to ensure all young people are safeguarded and supported throughout the whole process.
Digital fingerprint technology
The tool has been developed in collaboration with law enforcement to make sure that children are not unnecessarily visited by the police when they make a report. A young person can report anonymously, at any time of day, and the IWF will then work to have the image removed if it breaks the law.
A digital fingerprint, known as a "hash", can be created from the image. This is then provided to tech platforms to help ensure the image is not shared or uploaded again. Any young person who makes a report should also receive feedback on the outcome of their report within one working day from the IWF, via Childline.
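To give a sense of how hash matching works in general, the sketch below computes a fingerprint of an image file and checks it against a list of known hashes. This is a simplified illustration only, not the IWF's actual system: the function names and example hash are hypothetical, and platforms typically rely on perceptual hashing that can also match resized or re-encoded copies, which a plain cryptographic hash like SHA-256 cannot.

```python
# Illustrative sketch of hash-based matching (hypothetical names and values).
# A plain SHA-256 only matches byte-identical copies of a file; real systems
# use more robust perceptual hashes that survive resizing and re-encoding.
import hashlib
from pathlib import Path

def fingerprint(path: Path) -> str:
    """Return a hex digest ("hash") of the file's raw bytes."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

# A platform would hold a list of hashes of reported images
# (placeholder value shown) and refuse uploads that match.
known_hashes = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def is_blocked(upload: Path) -> bool:
    """Check an uploaded file against the known-hash list."""
    return fingerprint(upload) in known_hashes
```

The key point for parents is that only the fingerprint, not the image itself, needs to be shared with platforms for them to block future uploads of the same file.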
We strongly recommend parents have an open discussion with their children and teens about the serious potential for harm caused by sharing self-generated sexual content with others. Research suggests the most vulnerable group is girls aged between 9 and 15.