A recent report from Thorn, a technology nonprofit, reveals that minors are increasingly taking and sharing sexual images of themselves, both consensually and coercively. The report also highlights an increase in risky online interactions between youth and adults. This aligns with data from the National Center for Missing & Exploited Children's (NCMEC) CyberTipline, which has seen a 329% increase in child sexual abuse material (CSAM) files reported over the last five years.
Several factors contribute to the rise in reports, including the deployment of tools like Thorn’s Safer product, which uses hashing and matching to detect known CSAM. Online predators are also becoming more brazen and using novel technologies like chatbots to entice children. Additionally, there is a rise in self-generated CSAM (SG-CSAM).
Hashing and matching is a crucial technology in combating CSAM. It converts files into hash values that act as digital fingerprints, then compares them against hash lists of known CSAM to identify and remove illicit content. Safer, a tool built by Thorn, offers access to a large database of known CSAM hash values and enables the sharing of hash lists among technology companies.
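The hash-and-match workflow described above can be sketched in a few lines. This is an illustrative example only, not Safer's actual implementation: the hash list here is hypothetical, and real systems pair cryptographic hashes with perceptual hashes so that re-encoded or slightly altered copies still match.

```python
import hashlib

# Hypothetical stand-in for a database of known hash values.
# (Real deployments query large, vetted, shared lists maintained
# by organizations such as Thorn and NCMEC.)
KNOWN_HASHES = {
    # SHA-256 of the bytes b"example-known-file"
    hashlib.sha256(b"example-known-file").hexdigest(),
}


def file_hash(data: bytes) -> str:
    """Compute a cryptographic fingerprint of a file's bytes.

    A cryptographic hash matches only exact duplicates; perceptual
    hashing (not shown) is needed to catch near-duplicates.
    """
    return hashlib.sha256(data).hexdigest()


def is_known(data: bytes, hash_list: set[str] = KNOWN_HASHES) -> bool:
    """Return True if the file's hash appears in the known-hash list,
    flagging it for review and removal."""
    return file_hash(data) in hash_list
```

Matching by hash rather than by image content means platforms can detect known material without human moderators having to view it, which is part of why hash-list sharing across companies is so effective.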
To eliminate CSAM from the internet, tech companies and NGOs must work together. Content-hosting platforms play a vital role, and Thorn is committed to providing tools and resources to combat child sexual abuse at scale. In 2022, Safer identified over two million pieces of CSAM on its customers' platforms, underscoring the importance of deploying CSAM detection tools across many platforms.
The more platforms that utilize CSAM detection tools, the better chance there is of reversing the alarming rise in child sexual abuse material online.
