Government backs EU proposals to scan personal messages for child sexual abuse images

The government has signaled its strong support for sweeping EU proposals that would require tech companies to scan people’s private online communications, including encrypted messages, for child sex abuse images and grooming behaviors.

As previously reported in the Irish Examiner, the plans, drafted by the European Commission, have been welcomed by children’s groups but opposed by civil rights campaigners.

Now the government has said it backs the proposals, saying it is ‘right’ for member states to adopt a mandatory system for detecting and reporting child sexual abuse material (CSAM).

In a briefing note on the draft EU regulation, the Department of Justice said the regulation stems from an EU strategy on combating child sexual abuse adopted in July 2020.

It said the proposal introduces a legal framework under which online messaging service providers will be responsible for assessing the risk of CSAM circulating on their services and for “detecting, reporting and removing” such abuse.

“This represents a major change from the current system, where some online service providers voluntarily detect and report CSAM, but are not required to do so,” the department said.

The note indicates that the regulation also proposes to create a European center to prevent and counter child sexual abuse and to support the implementation of the regulation, in particular by creating databases of CSAM indicators which service providers will have to use to fulfill their obligations.

“Ireland’s initial view is very supportive of what this proposal aims to achieve,” the department said.

“Child sexual abuse and the creation and spread of CSAM are extremely serious crimes, which are on the rise. It is right that we move to a mandatory detection and reporting system.”

The department said that, given the cross-border nature of online services, the government believes there is a need, from both a criminal justice and an internal market perspective, for action to be taken at a European level.

It said Ireland recognizes that there are “genuine privacy concerns” around the implications of some of the measures.

The department said the proposal places broad responsibilities on national authorities, adding: “Given that the European headquarters of many large technology companies are located in Ireland, this will mean a particular responsibility for Ireland in terms of national implementation.”

It said there was a “clear public good” behind the proposal, especially for victims, in having the CSAM depicting them removed from the internet.

“Concerns have been raised that the public may be impacted by actions that online service providers may be required to take, based on a detection order, with respect to the CSAM hosted on their platforms,” it said.

The department welcomed the recognition by the European Commission that a “fair balance must be struck” between the protection of children and their fundamental rights and the fundamental rights of service users.

The department said it has begun initial consultations with relevant government agencies and expects to hear the views of other parties, including victims and industry, during the negotiations.