14 April 2021
As coordinator of Provenance, the FuJo Institute in DCU has provided feedback to the EU consultation on the revised Code of Practice on Disinformation. The FuJo submission calls for the development of common definitions of key terms, stronger oversight mechanisms to monitor compliance, and a framework for data-sharing to facilitate researchers’ access to platform data.
The EU Code of Practice on Disinformation was launched in 2018 as a self-regulatory mechanism to counteract online disinformation. The voluntary signatories included tech companies – Facebook, Google, Mozilla and Twitter – as well as advertising associations. Among other objectives, Code signatories agreed to develop safeguards against disinformation; enhance scrutiny of online advertisements; reduce the visibility of disinformation; and increase content transparency.
The Code arose from deliberations among the major platforms, advertisers, and a “Sounding Board” composed of stakeholders from media, academia, and civil society. At the time, the Sounding Board criticised the Code’s lack of measurable objectives and enforcement instruments. In essence, Code signatories were free to report their data according to their own definitions of the problem.
These criticisms were reinforced by subsequent reviews of the Code’s implementation. A 2019 assessment by the European Regulators Group for Audiovisual Media Services (ERGA) found that the platforms’ transparency provisions lacked “the required detail” and were inconsistent. FuJo undertook research for two Irish reports – ElectCheck and CodeCheck – which informed ERGA’s assessments.
In September 2020, the European Commission’s assessment of the Code noted the need for commonly-shared definitions, more precise commitments tied to key performance indicators (KPIs) and appropriate monitoring mechanisms. In addition, the Commission recognised the need to secure greater access to data for independent evaluations.
FuJo’s submission addressed the following points:
Co-regulation: The Code’s self-regulatory approach may have been a logical first step in 2018, but a co-regulatory model is now necessary. In particular, there is a need for redress mechanisms in cases of non-compliance. A co-regulatory approach can engage a wide set of stakeholders including Member States, platforms, civil society, and relevant experts. As the digital environment changes quickly, the overall framework for the Code should be iterative, facilitating adaptations as required.
Common definitions: Many reviews have noted that the lack of common definitions prevents a shared understanding and inevitably results in inconsistent application of the Code. Common definitions should be formulated by the relevant stakeholders and defined at EU level to ensure consistency across Member States and across individual platforms. Terms that require clear definitions include: disinformation, political advertising, issue-based advertising and information operations. Regarding advertising, attention should be given to the importance of broad definitions (e.g. interest-group adverts published at any time) rather than narrow definitions (e.g. political-party adverts published before an election).
Common transparency standards: Platforms should be obliged to adopt common standards in their labelling of content, ad libraries, and transparency reports. Without common standards, the accessibility of the platforms’ transparency provisions is greatly diminished. Relevant stakeholders should be involved in the design of these standards to ensure accessibility and the desired levels of detail.
Data access: Despite widespread concern about online disinformation, there are still major gaps in knowledge about the spread of false claims and the effectiveness of countermeasures. The information currently provided by the platforms is not sufficient to address these gaps in knowledge. A data-sharing mechanism that is GDPR compliant is needed to secure access to relevant, disaggregated data for researchers. Relevant stakeholders should be involved in the design of the data-sharing mechanism to prevent the technical and governance challenges encountered by previous initiatives such as Social Science One.
Oversight and accountability: Code signatories should be subject to independent audits to assess compliance with the Code, to verify the platforms’ self-reported data, and to assess the effectiveness of the measures adopted to counteract disinformation. Meaningful audits will require access to commercially sensitive data, such as information about the operation of recommendation algorithms and content removals. The data-sharing mechanism should accommodate data access for auditing purposes.
The EU consultation is open until 29 April.
© 2019 Provenance | The PROVENANCE Action Management Team, Dublin City University, Glasnevin, Dublin 9, Ireland