Technologies to stop child sexual abuse material
In this short series of articles, we look at some of the technologies that are used to stop child sexual abuse material (CSAM) today. Some are used in NetClean’s products, and some are used by law enforcement and NGOs to find and remove material online.
Here we look at the second of two hashing technologies, robust hashing. You can read the first article on binary hashing here. Follow this series to read about AI and keyword matching over the next few weeks.
Like binary hashing (see the previous article), robust hashing is used to detect online child sexual abuse material at the content level and to give it a digital signature. In contrast to binary hashing, however, robust hashing looks at the actual visual content of an image rather than just the binary data of the image file. This adds another layer of protection against CSAM being shared, and against it ending up on devices and IT networks.
A robust hash is designed so that any image with the same visual content produces a matching hash value. As with binary hashes, the hash value cannot be reversed into an image. (See the previous article for more information on how a hash is created.)
Whereas two copies of the same image in different file formats produce completely different binary hashes, robust hashing can still detect the image after slight changes have been made to the file, such as resizing or a change of file format. This is because recognition is based on the visual content of the image, rather than on the binary file data.
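The difference can be sketched with a toy perceptual "average hash". This is a minimal illustration only, assuming a greyscale image represented as a 2D list of pixel values; real systems such as PhotoDNA use far more sophisticated features. The point is that the hash is derived from the visual content, so a resized copy matches, while a hash of the raw bytes does not.

```python
# Minimal sketch of a perceptual ("robust") hash. Illustrative only;
# not how PhotoDNA actually works.
import hashlib

def average_hash(image, size=8):
    """Downscale to size x size by block averaging, then set one bit
    per cell: 1 if the cell is brighter than the overall mean."""
    h, w = len(image), len(image[0])
    cells = []
    for r in range(size):
        for c in range(size):
            # Average all pixels that fall into this grid cell.
            rows = range(r * h // size, (r + 1) * h // size)
            cols = range(c * w // size, (c + 1) * w // size)
            block = [image[i][j] for i in rows for j in cols]
            cells.append(sum(block) / len(block))
    mean = sum(cells) / len(cells)
    return [1 if v > mean else 0 for v in cells]

def hamming(a, b):
    """Number of differing bits; a small distance means visually similar."""
    return sum(x != y for x, y in zip(a, b))

# A synthetic 16x16 "image": dark left half, bright right half.
img = [[0] * 8 + [255] * 8 for _ in range(16)]
# The "same" picture resized to 32x32.
resized = [[0] * 16 + [255] * 16 for _ in range(32)]

# The perceptual hashes match exactly:
print(hamming(average_hash(img), average_hash(resized)))  # 0

# ...but a hash of the raw bytes (as binary hashing uses) does not:
print(hashlib.sha256(bytes(sum(img, []))).hexdigest() ==
      hashlib.sha256(bytes(sum(resized, []))).hexdigest())  # False
```

Resizing, recompression, or a format change alters every byte of the file, so the byte-level hash changes completely, while the coarse brightness pattern that the perceptual hash captures stays the same.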
PhotoDNA is a widely used robust hashing technology. It was developed specifically to detect CSAM, and is used today by law enforcement, NGOs, businesses and platform providers.
As with binary hashing, PhotoDNA classification is carried out by law enforcement and a number of NGOs. The hashes are added to databases, which can then be used to match and detect known CSAM. Law enforcement use PhotoDNA hashes in the same way that they use binary hashes, and web crawlers use both binary and PhotoDNA hashes when trawling the net.
In contrast to binary hashes, robust hashes are more often used in environments where real-time detection is not critical. Social media platforms are one example of this.
NetClean ProActive is a tool that allows businesses to deploy robust hashing technology for a secondary, wider search in IT environments after binary hashing has made a match. The robust search scans nearby files for nearly identical material. It is an efficient tool for organisations that want to ensure that their IT environments are free from CSAM.
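The two-stage search described above can be sketched as follows. This is a hypothetical illustration, not NetClean's actual implementation: the function names, data layout, and Hamming-distance threshold are all assumptions made for the example.

```python
# Hypothetical two-stage scan: an exact (binary) hash lookup first,
# then a near-match pass using perceptual hashes and a Hamming-
# distance threshold. Illustrative only.
import hashlib

def hamming(a: int, b: int) -> int:
    """Count differing bits between two integer-encoded hashes."""
    return bin(a ^ b).count("1")

def find_matches(files, known_sha256, known_phashes, threshold=5):
    """files: list of (name, raw_bytes, perceptual_hash_as_int)."""
    hits = []
    for name, data, phash in files:
        if hashlib.sha256(data).hexdigest() in known_sha256:
            hits.append((name, "exact"))  # stage 1: binary match
        elif any(hamming(phash, k) <= threshold for k in known_phashes):
            hits.append((name, "near"))   # stage 2: robust match
    return hits

files = [
    ("a.jpg", b"\x01\x02", 0b10110010),  # byte-identical to a known file
    ("b.jpg", b"\x01\x03", 0b10110011),  # visually similar (1 bit off)
    ("c.jpg", b"\x09\x09", 0b01001100),  # unrelated
]
known_sha256 = {hashlib.sha256(b"\x01\x02").hexdigest()}
known_phashes = [0b10110010]

print(find_matches(files, known_sha256, known_phashes))
# [('a.jpg', 'exact'), ('b.jpg', 'near')]
```

The design rationale is that exact hash lookup is cheap enough to run everywhere, while the costlier distance comparison is reserved for the narrower follow-up search once a confirmed match has been found.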
In next week’s article, we look at the AI that is used to find and categorise CSAM found online.