Legal considerations

MANY STAKEHOLDERS, MANY DRIVERS

There is no external legal requirement for companies or organisations to protect their IT environments from child sexual abuse material (CSAM). There are, however, a number of legal aspects that companies need to consider, and a number of laws that affect the implementation of software to detect CSAM.

Possession of CSAM is a crime under the criminal code of most countries and should as such be reported to the police. In some countries, reporting is mandatory.

The fact that CSAM is illegal helps companies drive the necessary internal processes when child sexual abuse material is detected in the business IT environment. Committing a criminal act is in most cases legal grounds for termination of employment. The illegality of CSAM also shapes company processes with regard to how incidents, and the illegal material itself, must be handled. In countries where CSAM is not illegal, companies need to rely solely on their internal policies.

Labour laws

When an employer has lost confidence in an employee and wants to take disciplinary action or terminate employment, labour laws govern how the case should be handled, and these laws differ greatly between countries. Generally, it is easier to act on illegal activity, such as consumption or downloading of CSAM, than on inappropriate but legal behaviour, such as viewing pornography, which a company may also want to act on.

Countries, regions and industries

A country's legal framework may affect the implementation of software to detect CSAM in IT environments. In most countries such an implementation is straightforward; in some, labour unions have to be consulted, and in a small number of countries the approval of every employee is required.

Although this is not specific to CSAM, public sector and government organisations, which are funded by taxpayers, face external demands to prevent criminal behaviour, as well as requirements to put processes in place that make it possible to log, follow up on and trace activities.

For operators of essential services and providers of digital services, there are cybersecurity acts in the European Union, the US and Asia, such as the NIS Directive, EU-wide legislation on cybersecurity. Another example is the legal demand placed on the financial sector to help stop criminal networks and financial crime. While none of these regulations is aimed at CSAM, taking action on this issue can support compliance with these requirements.

GDPR

GDPR is the European Union and European Economic Area legislation on data protection and privacy, and it needs to be considered when deploying detection software. One requirement is that incident data stored by software that detects CSAM must be deleted within certain time frames.
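As a minimal sketch of what such a deletion rule could look like in practice, the Python below enforces a retention window on stored incident records. The IncidentRecord type, the purge_expired helper and the 90-day window are all hypothetical: GDPR sets no single figure, and the actual time frame must come from the organisation's own data protection assessment.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone

# Hypothetical retention period. GDPR sets no single figure; the real value
# must come from the organisation's own data protection assessment.
RETENTION = timedelta(days=90)

@dataclass
class IncidentRecord:
    incident_id: str
    created_at: datetime  # when the detection software logged the incident

def purge_expired(records: list[IncidentRecord]) -> list[IncidentRecord]:
    """Keep only the records that are still inside the retention window."""
    now = datetime.now(timezone.utc)
    return [r for r in records if now - r.created_at < RETENTION]

now = datetime.now(timezone.utc)
records = [
    IncidentRecord("inc-001", now - timedelta(days=120)),  # past retention
    IncidentRecord("inc-002", now - timedelta(days=10)),   # still retained
]
print([r.incident_id for r in purge_expired(records)])  # -> ['inc-002']
```

Running a purge like this on a schedule, rather than only when records are accessed, also makes it easier to log and demonstrate that deletion actually happened within the agreed time frame.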

“GDPR doesn’t present any legal obstacles to detecting child sexual abuse material on work computers; however, it needs to be considered in our processes.”

“GDPR puts demands on how we handle sensitive information. We therefore have a very limited group of people involved in incidents where employees are suspected of consuming child sexual abuse material.”

Privacy

Irrespective of GDPR, personal privacy, and the perceived risk to it, is an issue that often comes up in discussions around detection software.

“Some people internally were worried about the consequences and saw detection of CSAM as an invasion of privacy. However, the only thing this software does is prevent equipment from being used for illegal activities. How can that be an invasion of privacy?”

“We discussed the issue with the unions, but they didn’t have any reservations. Both they and we concluded that the software does not invade personal privacy, as it only reacts to material classified as illegal by law enforcement. However, we decided that we needed to be open and transparent about the implementation.”
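The narrow scope described in this quote can be illustrated with a deliberately simplified sketch: a file is flagged only if its hash exactly matches an entry in a list classified by law enforcement, so the content of files outside that list is never inspected or interpreted. The hash list and function name below are hypothetical, and real detection products typically use more robust matching (for example perceptual hashing) than a plain SHA-256 comparison.

```python
import hashlib
from pathlib import Path

# Hypothetical hash list. In practice, such lists are compiled and classified
# by law enforcement; the company never assembles or views the material itself.
KNOWN_ILLEGAL_SHA256: set[str] = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def matches_classified_material(path: Path) -> bool:
    """Hash a file and check for an exact match against the classified list.

    Only exact matches to already-classified material are flagged; the
    content of every other file is never inspected or interpreted.
    """
    digest = hashlib.sha256(path.read_bytes()).hexdigest()
    return digest in KNOWN_ILLEGAL_SHA256
```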
