Technologies to stop CSAM: Robust Hashing.
In this article, we look at one of two hashing technologies: robust hashing.
Like binary hashing (see our article on binary hashing), robust hashing is used to discover online child sexual abuse material at the content level, and to give it a digital signature. However, in contrast to binary hashing, robust hashing analyses the actual visual content of an image rather than just the binary data of the image file. The technology adds another layer of protection against CSAM being shared, and against it ending up on devices and IT networks.
Robust hashing produces a hash value that will match, or nearly match, the hash of any image with the same visual content. Like a binary hash, the hash value cannot be reversed into the image. (See our article on binary hashing for more information on how a hash is created.)
Whereas two copies of the same image saved in different file formats produce completely different binary hashes, robust hashing can still detect the image after slight changes to the file, such as resizing or a change of file format. This is because recognition is based on the visual content of the image, rather than on the binary file data.
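The general idea can be illustrated with a minimal average-hash sketch (a simplified, hypothetical stand-in for perceptual hashing in general, not the actual PhotoDNA algorithm, which is proprietary): the image is normalised down to a small grayscale grid, and each cell is compared against the mean brightness. Because the normalisation step averages away small differences, a resized copy produces the same bit string, while a visually different image does not.

```python
def downscale(pixels, size=8):
    """Block-average a grayscale image down to a size x size grid.

    This normalisation step is why resizing does not change the hash:
    an enlarged copy averages back down to the same small grid.
    """
    bh, bw = len(pixels) // size, len(pixels[0]) // size
    return [
        [
            sum(pixels[i * bh + y][j * bw + x] for y in range(bh) for x in range(bw))
            / (bh * bw)
            for j in range(size)
        ]
        for i in range(size)
    ]

def average_hash(pixels):
    """64-bit average hash: each cell is 1 if at least as bright as the mean."""
    grid = downscale(pixels)
    flat = [p for row in grid for p in row]
    mean = sum(flat) / len(flat)
    return "".join("1" if p >= mean else "0" for p in flat)

# A 16x16 horizontal gradient, and the same image "resized" up to 32x32.
original = [[x * 16 for x in range(16)] for _ in range(16)]
resized = [[original[y // 2][x // 2] for x in range(32)] for y in range(32)]
# A visually different image: a vertical gradient.
different = [[y * 16 for _ in range(16)] for y in range(16)]

assert average_hash(original) == average_hash(resized)    # same visual content
assert average_hash(original) != average_hash(different)  # different content
```

Real perceptual hashes such as PhotoDNA are far more sophisticated, but the principle is the same: the hash describes what the image looks like, not how its bytes happen to be stored.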
PhotoDNA is a widely used robust hashing technology. It was developed specifically to detect CSAM, and is used today by law enforcement, NGOs, businesses and platform providers.
As with binary hashing, PhotoDNA classification is carried out by law enforcement agencies and a number of NGOs. The hashes are added to databases, which can then be used to match and detect known CSAM. Law enforcement use PhotoDNA hashes in the same way that they use binary hashes, and web crawlers use both binary hashes and PhotoDNA hashes when trawling the net.
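Because robust hashes of near-identical images differ only in a few bits, matching against a database is typically done by counting differing bits rather than requiring exact equality. The sketch below (hypothetical helper names, using short bit strings for readability; not the actual PhotoDNA matcher) treats anything within a small Hamming-distance threshold as a match.

```python
def hamming_distance(h1, h2):
    """Number of positions at which two equal-length hash strings differ."""
    return sum(a != b for a, b in zip(h1, h2))

def find_match(candidate, known_hashes, threshold=5):
    """Return the first known hash within `threshold` differing bits, else None."""
    for known in known_hashes:
        if hamming_distance(candidate, known) <= threshold:
            return known
    return None

# A toy database of known hashes (real databases hold millions of 64-bit+ hashes).
database = ["0000111100001111", "1111000011110000"]

# A near copy of the first entry, with two bits flipped (e.g. a slight re-encoding).
near_copy = "0000011100001101"
assert find_match(near_copy, database) == "0000111100001111"

# An unrelated hash matches nothing.
assert find_match("0101010101010101", database) is None
```

The threshold trades off sensitivity against false positives: a larger threshold catches more heavily edited copies, but risks matching unrelated images.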
Compared to binary hashes, robust hashes are more often used in environments where real-time detection is not critical. Social media platforms are one example.
NetClean ProActive is a tool that allows businesses to deploy robust hashing technology for a secondary, wider search in IT environments after binary hashing has made a match. The robust search scans surrounding files for nearly identical material. It is an efficient tool for organisations that want to ensure that their IT environments are free from CSAM.
In next week’s article, we look at how AI is used to find and categorise CSAM online.