Technologies to stop child sexual abuse material
In a short series of articles, we will look at some of the technologies that are used to stop child sexual abuse material (CSAM) today. Some are used in NetClean’s products, and some are used by law enforcement and NGOs to find and remove material online.
CSAM is illegal and more prevalent than one would like to think. It can be caught by firewalls, and here we look at the different filter technologies that are available. Over the next weeks, read our articles on binary hashing, robust hashing, AI and keyword matching.
Many different types of filtering options exist, but as a basic rule, all filter technologies inspect the web traffic that passes in and out of an IT environment and decide what is allowed through. These technologies work at different layers of a network, and the layer determines how specific the filtering can be.
The technologies look for suspicious behaviour, surfing patterns, links, known “bad” domains or specific patterns. Large companies are likely to handle millions of DNS requests every second, and looking in detail at all that traffic requires enormous computing power. Therefore, companies install different solutions that work in layers and look at different parts of the traffic.
DNS filtering checks website requests against a database of prohibited addresses/domains and either allows the requested webpage to be displayed or refuses the request.
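The domain lookup described above can be sketched in a few lines. This is an illustrative example only; the blocklist entries and domain names are invented, and a real solution would check parent domains too, as shown here:

```python
# Minimal sketch of DNS-level domain blocking (illustrative only).
# The blocklist entries are made-up example domains.
BLOCKED_DOMAINS = {"bad.example.com", "malware.example.net"}

def is_allowed(requested_domain: str) -> bool:
    """Refuse the request if the domain, or any parent domain, is blocklisted."""
    domain = requested_domain.lower().rstrip(".")
    parts = domain.split(".")
    # Walk up the hierarchy: a.bad.example.com -> bad.example.com -> example.com ...
    for i in range(len(parts)):
        if ".".join(parts[i:]) in BLOCKED_DOMAINS:
            return False
    return True
```

Checking parent domains matters because a blocked domain can otherwise be reached trivially via a subdomain.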
URL filtering is a more sophisticated and granular technology that can be used to block access to specific websites, or parts of websites, known to contain malware. The most common ways that a filtering solution stops a page from being loaded onto a user’s device are:
- Blacklists: Lists of websites known to contain malware and viruses. When a request matches a blacklisted site, the request is denied.
- Category filtering: Filters block certain groups of websites, such as porn or gambling sites.
- Content filtering: Is used to block viruses, e-mail attachments, adverts etc. Website and email requests are allowed, but the requested item is inspected at the proxy server to determine whether it contains anything that meets the configured blocking criteria.
- Keyword filtering: Blocks access to specific content by keyword, without necessarily blocking access to an entire category of websites.
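The checks in the list above can be combined in one proxy-side decision, roughly as sketched below. This is a simplified illustration, not a real product’s logic; the blacklist, category map and keywords are all invented examples:

```python
# Illustrative proxy-side filter combining the checks listed above.
# All lists, categories and keywords are invented example data.
BLACKLIST = {"http://malware.example.com/payload"}
SITE_CATEGORIES = {"casino.example.com": "gambling"}  # host -> category
BLOCKED_CATEGORIES = {"gambling", "porn"}
BLOCKED_KEYWORDS = {"forbidden-term"}

def filter_request(url: str, host: str, content: str) -> str:
    """Return why a request was blocked, or 'allowed'."""
    if url in BLACKLIST:                                   # blacklist check
        return "blocked: blacklist"
    if SITE_CATEGORIES.get(host) in BLOCKED_CATEGORIES:    # category filtering
        return "blocked: category"
    if any(k in content.lower() for k in BLOCKED_KEYWORDS):  # keyword filtering
        return "blocked: keyword"
    return "allowed"
```

Note the ordering: the cheap exact-match checks run first, and content inspection, the most expensive step, runs last. This mirrors the layered approach described earlier.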
Filter technologies are only as effective as the intelligence put into the solutions – such as lists of keywords, domains or URLs known to contain harmful material. These solutions focus on security threats such as theft of business intelligence, service disruptions, ransomware, phishing, etc.
CSAM can be picked up by filters, but because matching it is often not a priority for general-purpose security solutions, other tools are needed – dedicated programs and crawlers that find the material before it appears inside a business. One such tool is NetClean ProTective, which uses URL filtering to specifically block sites that contain CSAM. Its detection rate is high because it uses matching lists that are constantly updated with data from reliable sources such as law enforcement.
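The point about constantly updated matching lists can be illustrated with a small sketch. This is not NetClean’s implementation; `fetch_feed` is a hypothetical stand-in for whatever secure transport a real product would use to pull updates from a trusted source:

```python
import time

def fetch_feed() -> set:
    """Hypothetical stand-in for fetching blocklist updates from a trusted feed."""
    return {"http://blocked.example.invalid/page1"}

class UrlBlocklist:
    """Keeps a URL blocklist fresh by periodically merging in feed updates."""

    def __init__(self, refresh_seconds: float = 3600.0):
        self.urls = set()
        self.refresh_seconds = refresh_seconds
        self.last_refresh = None  # never refreshed yet

    def refresh_if_stale(self) -> None:
        now = time.monotonic()
        if self.last_refresh is None or now - self.last_refresh >= self.refresh_seconds:
            self.urls |= fetch_feed()  # merge updates; never shrink mid-session
            self.last_refresh = now

    def is_blocked(self, url: str) -> bool:
        self.refresh_if_stale()
        return url in self.urls
```

The design choice worth noting is that lookups stay a cheap set membership test while the slow network refresh happens only at intervals, so filtering throughput is not tied to feed latency.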
Read about hashing technologies, AI and keyword matching in our next articles.