Child abuse! It’s the most common abuse that remains unnoticed. Whether online or through physical exploitation, incidents of child abuse have risen sharply. More than three decades have passed since the adoption of the United Nations Convention on the Rights of the Child (UNCRC), yet child exploitation continues to gain momentum. The definition of child abuse is not confined to physical abuse; it also includes online sexual abuse, and it is most often the latter that goes unreported.
With technology seeping into every sector, it has become easy to manipulate and morph a child’s photo for lucrative gains.
In its 2018 report, the UN cited 18.4 million cases of suspected online abuse. The report also states that child sexual abuse material (CSAM) has increased over the past decade. An investigation by The New York Times revealed that big tech companies have reported over 45 million images and videos of child exploitation. Moreover, the Internet Watch Foundation found that in 39% of CSAM online the victims are under 10 years of age, and that 43% depicts acts of extreme sexual violence.
The exploitation of children doesn’t end there. Another report, by the UK’s Internet Watch Foundation, found that online child abuse surged by 50% during the COVID-19 lockdown. And despite the many strategies implemented to mitigate child abuse, international organizations have plainly not yet reached their goals.
Since technology is pervasive, a question often asked is “Will artificial intelligence help reduce online child sexual abuse?” or “Can AI stop child sexual exploitation?”. While opinions on this topic differ, companies are now introducing AI applications that can thwart online child abuse.
On that note, Analytics Insight brings you the top AI technologies that can help mitigate child sexual abuse.
1. Safer– Developed by the artificial intelligence company Thorn, this AI-powered tool detects child abuse images with around 99% accuracy. Safer allows tech platforms to identify, remove, and report child sexual abuse material at scale, a critical step in the plan to eliminate CSAM that Thorn shared at TED. While still in its beta phase, Safer has already enabled the takedown of nearly 100,000 known CSAM files.
Its services include:
- Image Hash Matching- Generates cryptographic and perceptual hashes and matches them against hashes of known CSAM to identify images of exploitation.
- CSAM Image Classifier- Uses a machine learning model to identify whether an image qualifies as CSAM.
- Video Hash Matching- Generates cryptographic and perceptual hashes for videos to match against hashes of known CSAM.
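The hash-matching approach above can be sketched in general terms: a platform computes a hash of each uploaded file and checks it against a database of hashes of known material. The sketch below is illustrative only (the hash set and function names are assumptions, not Safer’s actual API), and it uses a plain cryptographic hash; a production system would also use perceptual hashes to catch resized or re-encoded variants.

```python
import hashlib

# Illustrative stand-in for a vetted hash list of known flagged files,
# such as those maintained by NCMEC or the Internet Watch Foundation.
# (This entry is simply the SHA-256 digest of the bytes b"test".)
KNOWN_HASHES = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_of(data: bytes) -> str:
    """Compute the SHA-256 hex digest of a file's raw bytes."""
    return hashlib.sha256(data).hexdigest()

def matches_known_hash(data: bytes) -> bool:
    """True if the file's cryptographic hash is in the known-hash set.

    A cryptographic hash only catches exact byte-for-byte copies;
    perceptual hashes are needed to match visually similar variants,
    which is why tools like Safer generate both kinds.
    """
    return sha256_of(data) in KNOWN_HASHES

print(matches_known_hash(b"test"))   # True: digest is in the set
print(matches_known_hash(b"other"))  # False: unknown file
```

The design choice worth noting is the split: exact hash matching is cheap and has essentially no false positives, so it runs first; classifier-based detection then handles content with no known hash.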
2. Child Safe.AI– An artificial intelligence platform that monitors and models child exploitation risk on the web. Already deployed by US law enforcement, the platform actively collects signals of exploitation threats from the online ecosystems where they are known to occur and models those signals into probable risk. It helps organizations mitigate the risk of online child abuse by observing millions of conversational, content and photographic signals.
3. Spotlight– Developed by Thorn, this technology uses predictive analytics to identify victims of child sexual abuse and child trafficking. By analysing web traffic and data gathered from sex ads and escort websites, it flags potential victims of human and child trafficking. Already in use by US federal agencies to solve complex child trafficking cases, this AI-based tool has helped identify 14,874 child victims of human trafficking in the past four years.
4. AI Technology by the United Nations Interregional Crime and Justice Research Institute (UNICRI)– This technology uses AI tools and robotics to identify the location of long-missing children, scan illicit sex ads, and disrupt human and child trafficking. It is still evolving and is not yet in wide use.
5. Griffeye– This technology uses computer vision tools such as facial recognition and image recognition to scan images for nudity and age. It is already deployed by US federal agencies to identify and thwart CSAM.
6. Google’s AI tool– In 2018, the tech giant Google introduced an AI tool to mitigate online child abuse. Using deep neural networks for image processing, the technology assists reviewers and NGOs in sorting through large volumes of images by prioritizing the content most likely to be CSAM for review. The tool also helps classifiers keep up with offenders by targeting content that has not previously been confirmed as CSAM.
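The prioritization step Google’s tool performs can be sketched generically: given classifier scores for a batch of items, order the review queue so that the highest-risk content surfaces first. The scores, item IDs, and function below are illustrative assumptions, not Google’s actual API.

```python
import heapq
from dataclasses import dataclass, field

@dataclass(order=True)
class ReviewItem:
    # Score is negated so the highest-scoring (most likely CSAM)
    # item is popped first from Python's min-heap.
    priority: float
    item_id: str = field(compare=False)

def build_review_queue(scored_items):
    """Order items so human reviewers see the highest-risk content first.

    `scored_items` is a list of (item_id, score) pairs, where `score`
    is an illustrative classifier confidence in [0, 1].
    """
    heap = [ReviewItem(-score, item_id) for item_id, score in scored_items]
    heapq.heapify(heap)
    return [heapq.heappop(heap).item_id for _ in range(len(heap))]

queue = build_review_queue([("a", 0.12), ("b", 0.97), ("c", 0.55)])
print(queue)  # ['b', 'c', 'a']
```

Triage like this matters because human review capacity is the bottleneck: sorting by model confidence means the worst material is found and removed soonest, even when the backlog is large.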
7. ai– This technology uses computer vision trained on real CSAM to identify and flag new images containing child abuse. Used in conjunction with hash lists, it reduces investigators’ manual workload and helps prioritize cases involving child sexual abuse.
8. Cellebrite AI tools– Cellebrite’s AI tools use artificial intelligence and machine learning algorithms to help investigators streamline the collation, analysis and reporting of evidence of child abuse.
Any technology requires human intervention. Experts are sceptical about using AI to mitigate child abuse, but they point out that if the technology is leveraged without intrinsic biases, it can prove beneficial to humanity.