Good faith in the digital age: an overview of accountability and liability under the DSA
Tuesday 10 March 2026
Marta Boura
Abreu Advogados, Lisbon
marta.boura@abreuadvogados.com
The accountability of online platforms for the dissemination of illegal content is widely regarded as one of the main virtues of the European Union’s Digital Services Act (DSA). The Regulation provides for the swift removal of unlawful content and is functionally oriented towards the assessment and mitigation of systemic risks. However, enforcing rules on illegal content remains challenging because: (1) the notion of ‘illegal content’ is inherently broad and often difficult to assess; (2) platforms may be exposed to political pressure in the qualification and removal of such content; and (3) its dissemination frequently occurs at a speed that outpaces traditional enforcement mechanisms. It is precisely in this context that we encounter one of the most acute and complex contemporary threats to the determination and effective exercise of individual rights and freedoms: disinformation and hate speech.

Over the past year alone, international organisations have reported a 58 per cent increase in online hate speech, according to the Areto Hate Speech Index. National data confirm this trend: in Portugal, for instance, crimes related to hate speech increased by approximately 38 per cent in 2024, making the issue a central matter of current political debate. These figures illustrate the difficulty of crafting effective responses to widespread and escalating behaviour with harmful consequences for fundamental rights.
Although good faith constitutes an undisputed legal principle, its meaning and normative implications vary significantly across legal traditions. In certain jurisdictions, good faith has materialised into a specific legal command, giving rise to concrete duties of conduct not only during contract performance but also before and beyond the contractual relationship; in others, it plays a merely axiological or interpretative role. In private law, this controversy has shaped and influenced international legal instruments at multiple levels. The UNIDROIT Principles and the United Nations Convention on Contracts for the International Sale of Goods (CISG) illustrate the challenges of reaching a common understanding of the scope of good faith, particularly in international contract law, where the need to ensure uniformity calls for an autonomous, internationally accepted concept. Moreover, the inherent vagueness of good faith requires densification, which, in the context of the CISG, typically occurs through the recognition of ancillary or secondary duties, such as the duties of cooperation and disclosure.
The debate has recently resurfaced in the context of the EU’s digital regulatory framework. Although neither the AI Act nor the DSA, for example, expressly sets out good faith as a guiding principle, it may be regarded as a key principle underpinning their transparency and accountability obligations. Beyond these dimensions, good faith is progressively invoked as an anchor of trust and liability in the digital ecosystem, and may yet be shaped into a new, autonomous concept.
Within the DSA, the principle of enforced self-regulation and the imperative of enhancing trust in online interactions can be identified as core manifestations of digital good faith. This discussion was deepened by the YouTube and Cyando judgment of 2021.[1] As Rónán Riordan has highlighted,[2] that decision clarified the application of host liability rules: the Court of Justice of the European Union (CJEU) reaffirmed that a host must maintain a neutral role in order to benefit from the liability exemptions, while making clear that a provider which voluntarily implements technological measures to detect copyright-infringing content does not, for that reason alone, play an active role. Consequently, and in contrast with earlier decisions (such as Google France v Louis Vuitton),[3] the Court recognised the importance of creating incentives for providers to take proactive measures to detect and remove illegal content.
The DSA addresses these concerns by deploying good faith as an ‘incentive shield’ for service providers, as established in the ‘Good Samaritan’ clause of Article 7. Accordingly, if a provider identifies illegal content and actively attempts to address it through voluntary own-initiative investigations, the DSA ensures that such conduct, carried out in good faith, cannot be used against the provider. Good faith therefore operates as an incentive tool, protecting providers and encouraging voluntary measures aimed at ensuring online safety. Although this represents a step forward in mitigating the dissemination of illegal content, there is room for further action, as merely removing disincentives to voluntary measures has not proved sufficiently effective. In this context, good faith can also operate as a diligence-based duty, requiring providers to act preventively even in the absence of any prior identification of illegal content. This duty may be articulated through two core commands: the adoption of measures to prevent the dissemination of unlawful material and the proactive implementation of mitigation mechanisms.
Although Article 35 of the DSA sets out specific mitigation duties, these are triggered only after systemic risks have been identified. Their scope is therefore of a different nature, as compliance is assessed by reference to the effectiveness of the measures in addressing those specific risks. In this sense, good faith understood as a duty of diligence would precede, complement and surpass the mitigation framework already established: it would guide providers’ conduct before, during and after the risk assessment process, filling interpretative gaps and shaping responsible behaviour beyond the duties listed in the DSA. As a duty of loyalty and fairness, good faith could expand the DSA’s logic of accountability by imposing a broader obligation to act diligently, not only shielding providers from liability but fundamentally binding them to proactive conduct in the digital ecosystem.
[1] Joined Cases C-682/18 and C-683/18, YouTube and Cyando, ECLI:EU:C:2021:503, judgment of 22 June 2021.

[2] Rónán Riordan, ‘A Case Study of Judicial-Legislative Interactions via the Lens of the DSA’s Host Liability Rules’, European Papers – A Journal on Law and Integration, 10(1), 2025, pp. 259–297.

[3] Joined Cases C-236/08 to C-238/08, Google France and Google v Louis Vuitton Malletier and others, ECLI:EU:C:2010:159, judgment of 23 March 2010.