We’re Creating a Safer Internet, Together

June 25, 2024

4 Minute Read

At Thorn, we’re dedicated to building technology to defend children from sexual abuse. Key to this mission is Safer, our detection solution that enables content-hosting platforms to find and report child sexual abuse material (CSAM). In 2023, more companies than ever deployed Safer. This shared dedication to child safety is extraordinary, and it helped advance our goal of building a truly safer internet.

Safer’s 2023 Impact

Safer’s community of customers spans a wide range of industries. Yet they all host content uploaded by their users.

Safer empowers their teams to detect, review, and report CSAM at scale. That scale is crucial: it means content moderators and Trust and Safety teams can find CSAM amid the millions of pieces of content uploaded to their platforms every day, saving time and accelerating their efforts. Just as importantly, Safer allows teams to report CSAM to central reporting agencies, like the National Center for Missing & Exploited Children (NCMEC), which is critical for identifying child victims.

Safer’s customers rely on our predictive machine learning models, along with a comprehensive hash database, to help them find CSAM. Together with our customers, we’re making strides toward eliminating CSAM from the internet.

Total files processed

More files were processed through Safer in each of the last three months of 2023 than in 2019 and 2020 combined. These inspiring numbers indicate a shift in priorities for many companies as they strengthen their efforts to center the safety of children and their users. 

In total, Safer processed 71.4 billion files input by our customers, a 70% increase over 2022 propelled in part by the addition of 10 new Safer customers. Today, the Safer community comprises 50 platforms with millions of users and vast amounts of content, creating a significant and ever-growing force against CSAM online.

Safer’s database currently contains more than 57 million hashes, giving customers comprehensive coverage for detecting known CSAM. Through SaferList, customers can contribute hash values of their own, helping to diminish the viral spread of CSAM across platforms within the Safer community.

Total potential CSAM files detected

Our customers detected more than 2,000,000 images and videos of known CSAM in 2023. In each case, Safer matched a file’s hash to one in a database of verified CSAM. Hashes act like digital fingerprints: they allow Safer to programmatically determine whether a file has previously been verified as CSAM by NCMEC or other NGOs.
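To picture the matching step, here is a minimal sketch of exact hash matching in Python. It is an illustration only, not Safer’s implementation or API: the hash set below is a stand-in for the verified database, and production systems also use perceptual hashes, which catch re-encoded or lightly altered copies that exact hashes miss.

```python
import hashlib

# Illustrative sketch of exact hash matching -- not Safer's actual code.
# In practice, the set would be loaded from a database of hashes verified
# by NCMEC and other NGOs, and perceptual hashing would run alongside
# cryptographic hashing to catch visually similar copies.
known_csam_hashes: set[str] = set()  # stand-in for the verified hash database

def file_hash(path: str) -> str:
    """Compute a cryptographic fingerprint of a file's bytes."""
    digest = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()

def is_known_csam(path: str) -> bool:
    """Exact-match lookup: has this exact file been verified before?"""
    return file_hash(path) in known_csam_hashes
```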

In addition to detecting known CSAM, our classifiers detected more than 1,500,000 files of potential unknown CSAM. Safer’s image and video classifiers use artificial intelligence to predict whether new content is likely to be CSAM and flag it for further review.
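The flagging step can be thought of as routing high-scoring content into a human review queue rather than acting on it automatically. The sketch below assumes a generic scoring model and an arbitrary threshold; neither reflects Safer’s actual classifiers.

```python
from dataclasses import dataclass

@dataclass
class ReviewItem:
    file_id: str
    score: float  # model's predicted probability that the file is CSAM

# Illustrative threshold only; real systems tune this against accuracy
# targets and moderator review capacity.
REVIEW_THRESHOLD = 0.8

review_queue: list[ReviewItem] = []

def route_content(file_id: str, score: float) -> None:
    """Send likely-CSAM content to human reviewers instead of auto-actioning."""
    if score >= REVIEW_THRESHOLD:
        review_queue.append(ReviewItem(file_id, score))
```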

Altogether, Safer detected more than 3,800,000 files of known or potential CSAM, a 365% increase in just a year, showing both the accelerating scale of the issue and the power of a unified fight against it.

Safer’s all-time impact

Last year continued to highlight the profound impact of a coordinated approach to eliminating CSAM online. Since 2019, Safer has processed 129.4 billion files from content-hosting platforms and, of those, detected 5 million CSAM files. Each year that we remain focused on this mission helps shape a safer internet for children, digital platforms, and their users.

Join our mission

Combating the spread of CSAM requires a united front. Unfortunately, efforts across the tech industry remain inconsistent, and even proactive companies rely on siloed data. Thorn’s Safer solution is here to change that.

 

Originally published in June 2024 on safer.io


