While the DSA is under discussion, there could be legal uncertainty with respect to content
The new digital regulatory package the EU announced at the end of 2020 was meant to regulate
digital services and platforms comprehensively, along with the relationships they establish. However, the
Digital Services Act and the Digital Markets Act have to be brought into conformity with other European
instruments, and all of them systematized. Problems have already emerged regarding the detection of
child sexual abuse online. A new amendment to the European Electronic Communications Code
redefines the term “electronic communications services” to include messaging services, such as
Facebook Messenger. The EU e-Privacy Directive, which relies on the Communications Code’s
definition of “electronic communications services”, forbids operators of such services from processing users’
content or traffic data, even if the purpose of processing is to detect child sexual abuse. The instrument
ensures the protection of private life and the confidentiality of communications and personal data in the
electronic communications sector. In contrast to the GDPR, the e-Privacy Directive does not provide a
legal basis for the voluntary processing of content or traffic data for the purpose of detecting child sexual
abuse.
The EU Council was aware of the gap but was unsuccessful in adopting a temporary exception allowing
the blocking of child abuse content before the amendment came into force on December 21, 2020. The
Council then stated that it would propose legislation to tackle such illegal content by the second quarter of 2021.
“Protecting children and fighting against child sexual abuse in any form is an absolute priority for the EU.
The valuable activities carried out on a voluntary basis online to detect and remove this criminal material
must be able to continue without interruption,” said Peter Altmaier, German Federal Minister for Economic
Affairs and Energy and President of the Council. Under the Council position, the temporary regulation
will apply until December 31, 2025, or until an earlier date if permanent legislation is adopted that
repeals the temporary measure.
Under these circumstances, Facebook announced that, due to the legal uncertainty created, it would stop
applying its child abuse detection tools to content communicated in the EU via the company’s messaging
app. Other companies, such as Microsoft, Google, and LinkedIn, voiced their position that they would
not cease their voluntary efforts to detect and remove child abuse content, in line with the responsible
approach they had already adopted.
In a statement, Microsoft underlines that the company’s commitment to digital safety is unwavering. “For
many years, we have employed a range of detection methods to protect children from sexual abuse and
exploitation online, to prevent terrorists and violent extremists from exploiting online services, and to
protect users from spam and malware. We do this work by balancing important considerations of privacy,
safety, security, and other fundamental rights, consistent with the law.” As an example of responsible
policy, Microsoft points to the unique safety challenges in the gaming environment and stresses that it has
met them through “innovative tools to address harassment, grooming, and other harms.”
A lasting solution will come when the DSA enters into force and all the instruments are brought into alignment.
Compiled by Media 21 Foundation from: https://www.consilium.europa.eu