Parliament's ideas for tackling harmful or illegal content online
MEPs want the final decision on the legality of user-generated content to be taken by an independent judiciary, not private commercial entities.
Illegal online content should not only be removed; where it constitutes a criminal offence, it should also be followed up by law enforcement and the judiciary. The Commission should also consider obliging online platforms to report serious crimes to the competent authority.
Ways to tackle harmful content
To address the problem of harmful content such as hate speech or disinformation, MEPs propose increasing transparency obligations for platforms as well as raising media literacy among users.
Parliament noted that one reason disinformation spreads so fast is that some platforms’ business models favour showing users sensational and clickable content in order to increase profits. To counter the negative effects of this practice, MEPs want transparency on the monetisation policies of online platforms.
More choice for users over what they see online
MEPs want to give users more control over the content they see and the possibility to opt out of content curation altogether.
They are calling for stricter regulation of targeted advertising in favour of less intrusive, contextualised advertising based on what a user is looking at in a given moment rather than on their browsing history.
Going further, they want the Commission to look into more options for regulating targeted advertising, including an eventual ban.
The Commission is expected to present its proposal for the Digital Services Act by the end of 2020.