Gong’s comments on the Digital Services Act

Gong shared its perspectives and suggestions on some of the articles of Chapter III of the Digital Services Act:

Article 19 (Trusted Flaggers)

Bearing in mind the differences between Member States with regard to the rule of law, it would be beneficial to address the possible need for trusted flagger status to be awarded by an EU body and/or by the Digital Services Coordinators (DSCs) of other Member States. The lessons learned from the Croatian Wikipedia case call for active prevention of abuse of the upcoming EU regulation.

Article 21 (Notification of suspicions of criminal offences)

In addition to a “threat to the life or safety of persons”, another criterion for promptly informing the law enforcement or judicial authorities of the Member State(s) concerned should be considered: cases that may stem from relevant election legislation (political party financing, elections, referenda and other mechanisms of public participation regulated at EU and Member State level).

Articles 24 (Online advertising transparency), 30 (Additional online advertising transparency) and 36 (Codes of conduct for online advertising)

Full and timely (real-time) transparency of all ads should also be aligned with the regulation of elections, referenda and other mechanisms of public participation at EU and Member State levels. In Croatia, this means disclosing the sources of funding, the amounts paid and other details, both in pre-election periods (preliminary reports two weeks prior to the election date) and in periods between elections, as required by the regulation on transparency of political party financing (at Member State level, but also bearing in mind European elections). Although these regulations currently focus on political parties/candidates and the State Election Commission, information on political ads cannot be independently fact-checked or verified unless the data from the platforms serving the ads is made available.

Regarding the Codes of conduct for online advertising, it is of the utmost importance to ensure that effective transmission of information also takes into account the specific circumstances and vulnerabilities of election periods, when the need for information is immediate.

Article 26 (Risk assessment)

Platforms’ self-assessment should also cover systemic risks to democratic political systems (especially participation mechanisms), with due respect for minority communities and other legitimate communal interests that might suffer irreparable harm, ensuring appropriate space for the voiceless.

Article 33 (Transparency reporting obligations for very large online platforms)

To enable full transparency towards the European public sphere, the Digital Services Coordinators should have the power to review platforms’ decisions on what is withheld from the public as confidential information, conduct a public interest and proportionality test, and then decide on disclosure/publication. That decision should be open to challenge before the EU courts.

Article 35 (Codes of conduct)

In cases of identified significant systemic risk, the Commission should be required to also invite civil society and minority organisations, as well as other interested parties that represent legitimate communal interests.

For meaningful monitoring of the implementation of the Codes of conduct, especially in cases involving human rights violations, all relevant data should be made available for independent audits, most importantly by regulators, courts and researchers.

Lastly, we would also like to offer some of our previous research, stemming from election observation, that might be useful for your future consideration: