What is not allowed in the real world must also be a no-go in the virtual world
EESC calls on Commission to fight violent and discriminatory online content more effectively
The EESC acknowledges the Commission’s Communication Tackling Illegal Content Online – Towards an enhanced responsibility of online platforms as a first and useful step but is not satisfied with its scope. It therefore calls on the Commission to establish programmes and effective measures to provide a stable and consistent legal framework for the efficient removal of illegal content.
The EESC also considers the cases of illegal content mentioned in the Communication to be too narrow and proposes reviewing and cataloguing them in order to achieve greater inclusiveness rather than limiting them to the ones explicitly mentioned (terrorism, xenophobic speech, child sexual abuse material). For instance, malicious defamation and the dissemination of material that violates human dignity or contributes to gender violence should also be included.
“Spreading illegal content must be nipped in the bud”, said Bernardo Hernández Bataller when presenting his opinion on Illegal content/online platforms, adopted at the EESC plenary on 14 March 2018.
This is why the EESC also proposes paying special attention to the development, processing and dissemination of ostensibly informative content which appears at first glance to be legal but in fact conceals illegal content.
The focus should also be extended to anything that relates to metadata and the benefits that online platforms obtain by exploiting this data.
Illegal content is a complex and cross-cutting issue that needs to be tackled from a range of perspectives:
“Firstly, it is important to assess the impact of illegal online content and secondly, we need to harmonise the way it is dealt with in the legal framework of the Member States. When we talk about the adoption of criteria and measures, the starting point must be that what is forbidden in the real world must also be forbidden in the virtual world. We need to put a stop to illegal and inhuman content,” explained the rapporteur.
“Publishing illegal content, hate speech or incitement to terrorism is not a peccadillo. In order to prevent, combat and remove such material, we need to strengthen the measures in place. This is also important in order to protect minors,” said Mr Bataller.
Given the impact that digital platforms already have on our daily life and the risks they pose, it is essential to have clear and harmonised legislation all over Europe. “We need a healthy mix between regulatory measures and self-regulatory measures”, emphasised Mr Bataller.
The EESC believes that online platforms themselves should provide users with the tools to flag fake news and thus make other users aware that the veracity of the content has been called into question.
Given the potential consequences of an excessive concentration of economic power, the economic growth of some digital platforms also warrants increased vigilance.
Last but not least, the EESC believes that for a more coherent approach the E-Commerce Directive, the Unfair Commercial Practices Directive and the Directive on Misleading and Comparative Advertising should be revised.