
THE CURRENT STATE OF CSAM DETECTION
Besides Peak's AI, only two CSAM detection technologies exist.
Let's take a closer look.
01  Hash Matching (RECALL: 18.5%)
02  Minor + Nudity Detection (ACCURACY: 5.2%)
COMPARE SPECS
Minor + Nudity Detection vs. Hash Matching vs. Artificial Intelligence, compared on: trained on real CSAM, finds net new CSAM, finds gen AI CSAM, no user reports needed, can detect CSAM with no face visible, consistent accuracy across demographics, optimized for minimal false positives.

PERFORMANCE
Minor + Nudity Detection: 5.2% | Hash Matching: 18.5% | Artificial Intelligence: 98.5%
HASH MATCHING
RECALL: 18.5%

WHAT IS IT?
Hash matching is a method of identifying records by their unique hash values, which are digital fingerprints assigned to data like images or videos.

Hash databases contain files that have been manually reviewed and confirmed as CSAM, each assigned a unique hash. When that file is reuploaded, hashing tools match it to its corresponding hash in the database.
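
To make the mechanism concrete, here is a minimal sketch of hash-based lookup in Python; the hash set and file paths are hypothetical stand-ins for a real, vetted hash database:

```python
import hashlib
from pathlib import Path

# Hypothetical stand-in for a vetted hash database of confirmed files.
KNOWN_HASHES: set[str] = {
    "0f3a...placeholder...",  # each entry is the fingerprint of one reviewed file
}

def fingerprint(path: Path) -> str:
    """Compute the file's digital fingerprint (here, a SHA-256 digest)."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

def matches_database(path: Path) -> bool:
    """An upload is flagged only if its exact fingerprint is already known."""
    return fingerprint(path) in KNOWN_HASHES
```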

WHY DOESN'T IT WORK?
Hashing tools can only automatically identify CSAM whose hash already exists in their database; they cannot match new CSAM to any hash value.

The Dilemma:
CSAM can't be detected if it's not in the database,
but it can't be added to the database unless it's found.

Even if a file is in the database, the exact same file must be reuploaded in order to be auto-detected; any alterations, like cropping, would change its hash value.
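
The brittleness is easy to demonstrate: with a cryptographic hash such as SHA-256, changing even one byte of a file (a crop, a re-encode, a resize) produces a completely different fingerprint, so the altered copy no longer matches the database entry. The bytes below are placeholders standing in for real files:

```python
import hashlib

original = b"...image bytes..."   # stand-in for a file already in the hash database
altered = b"...image bytes....."  # the same file after a minor alteration

print(hashlib.sha256(original).hexdigest())
print(hashlib.sha256(altered).hexdigest())
# The two digests share nothing, so the altered upload is invisible to hash matching.
```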

The rise of generative AI CSAM exacerbates this issue, as these files aren't in any hash database. When reported, they flood the review queue with artificial images, obscuring cases of real children in danger.

MINOR + NUDITY DETECTION
ACCURACY: 5.2%

WHAT IS IT?
Companies that cannot train on CSAM but still advertise AI-based CSAM detection are using models that detect minors combined with models that detect nudity.

WHY DOESN'T IT WORK?
While these models can handle general tasks like nudity detection and age estimation, they are not trained on CSAM itself, and that training is crucial for distinguishing between benign and harmful content.

Combining nudity detection with minor detection rests on a critical logic leap when it comes to accurately identifying CSAM. Machine learning models rely on vast amounts of relevant training data to learn and make accurate predictions.

Without the right training data, these models fail to make the nuanced distinctions required to effectively identify and flag CSAM, resulting in a near-zero accuracy rate.
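
Conceptually, the proxy approach reduces to the conjunction sketched below; the scores and threshold are hypothetical, not any specific vendor's pipeline:

```python
def flag_with_proxy_models(minor_score: float, nudity_score: float, threshold: float = 0.8) -> bool:
    """The logic leap: treat (probably a minor) AND (probably nudity) as CSAM.

    Neither underlying model has ever seen the class it is meant to find,
    so the conjunction inherits both models' errors: benign content can
    trip both signals, and genuinely harmful content can evade either one.
    """
    return minor_score >= threshold and nudity_score >= threshold
```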

© 2024 by Take A Peak, Inc.