Today the European Commission published reports by Facebook, Google and Twitter covering the progress made in January 2019 on their commitments to fight disinformation. These three online platforms are signatories of the Code of Practice against disinformation and have been asked to report monthly on their actions ahead of the European Parliament elections in May 2019.
More specifically, the Commission asked the signatories to provide detailed information enabling it to monitor progress on the scrutiny of ad placements, the transparency of political advertising, the closure of fake accounts and marking systems for automated bots. Vice-President for the Digital Single Market Andrus Ansip, Commissioner for Justice, Consumers and Gender Equality Věra Jourová, Commissioner for the Security Union Julian King, and Commissioner for the Digital Economy and Society Mariya Gabriel said in a joint statement:
“The online platforms, which signed the Code of Practice, are rolling out their policies in Europe to support the integrity of elections. This includes better scrutiny of advertisement placements, transparency tools for political advertising, and measures to identify and block inauthentic behaviour on their services.
However, we need to see more progress on the commitments made by online platforms to fight disinformation. Platforms have not provided enough details showing that new policies and tools are being deployed in a timely manner and with sufficient resources across all EU Member States. The reports provide too little information on the actual results of the measures already taken.
Finally, the platforms have failed to identify specific benchmarks that would enable the tracking and measurement of progress in the EU. The quality of the information provided varies from one signatory of the Code to another depending on the commitment areas covered by each report. This clearly shows that there is room for improvement for all signatories.
The electoral campaigns ahead of the European elections will start in earnest in March. We encourage the platforms to accelerate their efforts, as we are concerned by the situation. We urge Facebook, Google and Twitter to do more across all Member States to help ensure the integrity of the European Parliament elections in May 2019.
We also encourage platforms to strengthen their cooperation with fact-checkers and academic researchers to detect disinformation campaigns and make fact-checked content more visible and widespread.”
Main outcomes of the signatories’ reports:
- Facebook has not reported on results of the activities undertaken in January with respect to scrutiny of ad placements. It had earlier announced that a pan-EU archive for political and issue advertising would be available in March 2019. The report provides an update on cases of interference from third countries in EU Member States, but does not report on the number of fake accounts removed due to malicious activities specifically targeting the European Union.
- Google provided data on actions taken during January to improve scrutiny of ad placements in the EU, broken down by Member State. However, the metrics supplied are not specific enough and do not clarify to what extent the actions were taken to address disinformation rather than for other reasons (e.g. misleading advertising). Google published a new policy for ‘election ads’ on 29 January, and will start publishing a Political Ads Transparency Report as soon as advertisers begin to run such ads. Google has not provided evidence of concrete implementation of its policies on integrity of services for the month of January.
- Twitter did not provide any metrics on its commitments to improve the scrutiny of ad placements. On the transparency of political ads, contrary to what was announced in its January implementation report, Twitter postponed its decision on this matter until the February report. On integrity of services, Twitter added five new account sets, comprising numerous accounts in third countries, to its publicly available and searchable Archive of Potential Foreign Operations, but did not report on metrics to measure progress.
Next steps
Today’s reports cover measures taken by online companies in January 2019. The next monthly report, covering the activities carried out in February, will be published in March 2019. This will allow the Commission to verify that effective policies to ensure the integrity of electoral processes are in place before the European elections in May 2019.
By the end of 2019, the Commission will carry out a comprehensive assessment of the Code’s initial 12-month period. Should the results prove unsatisfactory, the Commission may propose further actions, including of a regulatory nature.
Background
The monitoring of the Code of Practice is part of the Action Plan against disinformation that the European Union adopted last December to build up capabilities and strengthen cooperation between Member States and EU institutions to proactively address the threats posed by disinformation.
The reporting signatories committed to the Code of Practice in October 2018 on a voluntary basis. In January 2019 the European Commission published the first reports submitted by signatories of the Code of Practice against disinformation. The Code aims to achieve the objectives set out in the Commission’s Communication of April 2018 by setting out a wide range of commitments across five areas:
- Disrupt advertising revenue for accounts and websites misrepresenting information and provide advertisers with adequate safety tools and information about websites purveying disinformation.
- Enable public disclosure of political advertising and make efforts towards disclosing issue-based advertising.
- Have a clear and publicly available policy on identity and online bots and take measures to close fake accounts.
- Offer information and tools to help people make informed decisions, and facilitate access to diverse perspectives about topics of public interest, while giving prominence to reliable sources.
- Provide privacy-compliant access to data to researchers to track and better understand the spread and impact of disinformation.
Between January and May 2019, the Commission is carrying out a targeted Monthly Intermediate Monitoring of the platform signatories’ actions to implement Code commitments that are the most relevant and urgent to ensure the integrity of elections. Namely: scrutiny of ad placements (Commitment 1); political and issue-based advertising (Commitments 2 to 4); and integrity of services (Commitments 5 & 6).
The Code of Practice also goes hand-in-hand with the Recommendation included in the election package announced by President Juncker in his 2018 State of the Union Address to ensure free, fair and secure European Parliament elections. The measures include greater transparency in online political advertisements and the possibility to impose sanctions for the illegal use of personal data to deliberately influence the outcome of the European elections. As a result, Member States have set up a national election cooperation network of relevant authorities – such as electoral, cybersecurity, data protection and law enforcement authorities – and appointed a contact point to participate in a European-level election cooperation network. The first meeting of this network took place on 21 January 2019 and a second one on 27 February 2019.
More information
Reports of the online platforms
Press release: A Europe that Protects: The EU steps up action against disinformation
Factsheet: Action plan against disinformation