Why NetClean ProActive


Protect your IT environment while safeguarding children. Enhance ethical company values and stop the spread of child sexual abuse material. 

Work computers are used to access and distribute child sexual abuse material

NetClean ProActive is powerful, scalable detection software that manages security risks by keeping IT environments free from child sexual abuse material. The application protects and strengthens company values, and helps create a brighter future for children by working towards the goals of Agenda 2030.

1 in 500

1 in 500 people look at child sexual abuse material while at work.


Half of those who consume child sexual abuse material also abuse children physically.


of the offenders abuse children with whom they have a close relationship.

A unique position to make an impact

As an ethical corporate citizen, you are in a unique position to make an impact. Why? Because businesses and organisations can stop child sexual abuse material in their IT environments, directly limiting the exposure and revictimisation that children suffer every time an image is shared. This proactive work also helps identify individuals who consume child sexual abuse material, thereby preventing further harm to children.

NetClean ProActive provides a way to adhere to legislation and policy compliance while helping to stop a serious crime.

By protecting your brand and safeguarding your assets from misuse, you also help ensure that your employees do not suffer the harm of coming across child sexual abuse material that might be stored in your IT environment.

Protection, Social Responsibility and Compliance

Corporate Social Responsibility and Sustainability (Agenda 2030)

Manage Security Risks

Prevent Crime and Protect
Company Assets

Protect and Strengthen
Company Brands

Policy Compliance

Protect Employees


Mitigate negative impact and reinforce your brand by adopting a firm position and helping to create a brighter future for children.

By making sure that the company has acted proactively, done what it can to manage the risks, and worked to protect the organisation and its employees, the company brand is also protected. It is a proactive way of limiting the risk of a media crisis. It is also an employer branding initiative, which is often seen in a very positive light within the organisation.

There is also evidence that taking action on societal issues strengthens the company brand in public opinion. A study by Gartner showed that 48% of the general population expect companies to take a public position on social issues regardless of the issue’s relevance to corporate objectives. The research showed that not only do stakeholders respond to companies taking a stance, they also respond in a positive manner three out of four times.


Individuals who handle child sexual abuse material are known to be risk-takers. By visiting inappropriate web pages they can become targets of extortion, while also exposing the business to an increased risk of malware and other cyber security threats.


A person who views or downloads child sexual abuse material is engaging in risky behaviour and represents a tangible security risk to the company. It is reasonable to assume that someone who is willing to engage in this type of risky behaviour might break other laws or flout company policies.


Employees visiting unregulated websites and media incur the risk that their visits can be traced back to the organisation. This in turn increases the risk of attacks such as DDoS attacks, spam and other threats. There is also a risk of malware when someone downloads illicit or unwanted material; the same applies to unverified USB sticks.


Individuals who consume child sexual abuse material are vulnerable to both threats and blackmail. This poses a significant security risk, especially where the employee holds a prominent position in the business or handles sensitive material that they could be blackmailed into divulging.


A risk, especially for employees in IT roles, is that they will come across illicit material left on company devices and networks by other individuals within the organisation. Exposure to child sexual abuse material can cause trauma, especially when it is repeated. If the organisation does not actively detect the material, the only way it comes to light is when other employees stumble upon it. By protecting the organisation against child sexual abuse material, its employees are also safeguarded from the risk of exposure to this traumatising material.

Make a difference.
Create a brighter future
for children

Making a difference can be easy. Child sexual abuse material is not just images – it is documented abuse and crime scene photos. Businesses can help stop this material from spreading and limit revictimization, which is crucial to the continued welfare of exploited children.

Frequently asked questions

Which technologies are used to stop child sexual abuse material?

Law enforcement tools aside, there are a number of technologies available today to address the problem, such as crawlers, blocking technologies, filter technologies, artificial intelligence, robust hashing and binary hashing. These technologies all have strengths and limitations, depending on the context in which they are used.


Why are web filters not enough?

Filter technologies are primarily used to manage security threats such as business intelligence leaks, service disruptions, ransomware, phishing and so on. While they can also be used to block websites known to contain child sexual abuse material, they have several weaknesses in this context.

Firstly, they are only as effective as the intelligence put into them: the lists of domains or URLs known to contain harmful material. Keeping these lists up to date requires a lot of work and continuous updates, and since the primary focus of these solutions is on other types of threats, child sexual abuse material unfortunately comes far down the list.

Secondly, they only block known URLs or domain names, and therefore miss all other ways of distributing the material (such as P2P, darknet, social media platforms, or a USB stick used to access the material). So although web filters are helpful, they are not enough, and they do not protect the organisation and its assets against child sexual abuse material.
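The second weakness can be sketched in a few lines. The domain names below are hypothetical, and real web filters are far more sophisticated, but the structural limitation is the same: a blocklist only applies to traffic that carries a listed address, so content arriving via an unlisted mirror, P2P, or a USB stick is never checked at all.

```python
from urllib.parse import urlparse

# Hypothetical blocklist of domains known to host harmful material.
BLOCKED_DOMAINS = {"known-bad.example"}

def filter_allows(url: str) -> bool:
    """A URL filter can only block requests whose domain is on the list."""
    return urlparse(url).hostname not in BLOCKED_DOMAINS

# A request to a listed domain is blocked:
assert not filter_allows("http://known-bad.example/img.jpg")
# The very same file fetched from an unlisted mirror sails through,
# and a file copied from a USB stick never produces a URL to check:
assert filter_allows("http://unlisted-mirror.example/img.jpg")
```

This is why content-based detection (matching the file itself, not its address) is needed as a complement.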

How can hashing technology be used to detect child sexual abuse material?

In addition to using filter solutions, businesses can install NetClean ProActive, software designed specifically to detect child sexual abuse material on work computers. The software works similarly to an antivirus programme, but instead detects when child sexual abuse material is handled on a work computer. To identify the images, hashing technology is used. When law enforcement investigates child sexual abuse cases, they produce a hash, a unique ‘digital fingerprint’, of each image. These hashes are then added to a database, which allows the software to match against images handled on the work computer. This means that NetClean ProActive only detects child sexual abuse material that has been classified by law enforcement. Upon detection, an alert is sent to designated persons within the organisation (business or public sector organisation), who handle the incident and report to the police.

Digital Fingerprints - how does it work?

NetClean collaborates with law enforcement authorities, who classify images and videos as illegal. In this process, a digital fingerprint is calculated from every image and video, and these fingerprints are added to our signature database. The database makes it possible for ProActive to detect the actual illegal content, instead of blocking an entire URL or specific file names. As soon as someone downloads, opens, moves or in any way handles a suspicious file, NetClean ProActive sends a notification. Matching against our database also ensures that nothing but child sexual abuse material is detected and reported, which means that ProActive will not detect family pictures from the beach and the like.
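The matching principle can be illustrated with a minimal sketch. The signature set and file contents below are hypothetical placeholders (the signature is simply the SHA-256 of the string "test"), and production systems also use robust, perceptual hashes that survive re-encoding; but the core idea is the same: only files whose fingerprint matches a pre-classified signature trigger a detection.

```python
import hashlib

# Hypothetical signature database of digital fingerprints. In reality this
# holds hashes of material classified as illegal by law enforcement;
# here it holds one placeholder value (the SHA-256 of b"test").
SIGNATURE_DB = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def fingerprint(data: bytes) -> str:
    """Calculate a digital fingerprint (a binary hash) of file content."""
    return hashlib.sha256(data).hexdigest()

def is_classified(data: bytes) -> bool:
    """True only if the file matches a classified signature."""
    return fingerprint(data) in SIGNATURE_DB

# Content whose hash is in the database is detected:
assert is_classified(b"test")
# Anything else (e.g. a family photo) produces no match and no alert:
assert not is_classified(b"holiday photo bytes")
```

Because matching is exact against classified signatures, false positives on ordinary pictures do not occur by construction.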

Will NetClean ProActive for example detect family pictures from the beach?

No. ProActive only matches against material classified as child sexual abuse by law enforcement, which ensures that nothing but child sexual abuse material is detected and reported. ProActive will not detect family pictures from the beach or similar.

What information do you receive from NetClean ProActive in case of an incident?

In case of an incident (when a suspicious image is detected), the designated person(s) receives an Incident Report.

The report can contain:

  • type of agent
  • computer name
  • DNS name
  • IP address
  • MAC address
  • the domain and country in which the computer is located

What happens when an incident occurs?

When an incident occurs, the designated person(s) within the organisation are notified via a text message or an e-mail. This means that those responsible do not need to continually supervise the system; alerts indicate when an incident has occurred. It is important that the NMS runs continuously in order to receive incident reports and send notifications. In the event of an incident, ProActive only sends information to the designated contacts within the company; no information leaves the company automatically.
