Article 323-3-2 of the French Criminal Code: applicability to and enforceability against online platforms
Thursday 26 March 2026
Diane Mullenex
Pinsent Masons, London
diane.mullenex@pinsentmasons.com
Background to Article 323-3-2
Article 323-3-2 of the French Criminal Code (Code pénal) makes it a criminal offence for a person providing an online platform service[1] to either:
- restrict access to their service to people using connection anonymisation techniques; or
- not comply with Article 6.V of Law No 2004-575 of 21 June 2004 (LCEN) (preservation of certain metadata) or with Articles 15, 16 and 18 of the Digital Services Act 2022 (DSA) (transparency reporting obligations, notice and action mechanisms, and notification of suspicions of criminal offences), while knowingly letting users transact in products, content or services whose sale, offer, acquisition or possession is manifestly illicit.
The penalty for this offence, whether completed or attempted, is seven years' imprisonment and a fine of €500,000. Where the offence is committed or attempted as part of an organised criminal group (bande organisée), the penalties increase to ten years' imprisonment and a fine of €1 million.
The offence was first created by Law No 2023-22 of 24 January 2023 (LOPMI) in the context of the sale of weapons and narcotics on the dark web. Law No 2024-449 of 21 May 2024 (SREN) then provided that the offence would apply, from 17 February 2024, to online platforms as defined under the LCEN. With effect from 15 June 2025, Article 28 of Law No 2025-532 of 13 June 2025 raised the maximum penalties from five to seven years' imprisonment and from a €150,000 to a €500,000 fine, and explicitly tied non-compliance to DSA obligations.
Article 323-3-2 initially attracted little debate, as it was seen as limited to the sale of weapons and narcotics on the dark web and, therefore, a reasonable and proportionate response to that risk. Three years later, and following some arguably onerous amendments, it is the subject of increasing legal debate as the Office of the Public Prosecutor of Paris seeks to rely on the provision in media-related cases.
Key questions on enforceability
In the event of enforcement, online platforms may be able to explore arguments based on a lack of proper notification of implementation measures to the European Commission, incompatibility with EU law principles or legal uncertainty as to scope.
Whilst it is clear (as the European Commission itself has reported) that the LOPMI was never notified to the Commission by the French authorities in accordance with the transparency requirements applicable to technical regulations under EU law, it remains open to debate whether the French legislator was under such an obligation. Under Article 5 of Directive (EU) 2015/1535, Member States must ‘immediately communicate to the Commission any draft technical regulation’.[2] The term ‘technical regulation’ is defined very broadly in Article 1(f) of the same directive and includes ‘rules on services’. Article 1(e) further clarifies that ‘rules on services’ encompass general requirements ‘relating to the taking-up and pursuit of service activities within the meaning of point (b), in particular provisions concerning the service provider, the services and the recipient of services’.
Although the investigating judge in these cases rejected this argument (ie, that the French legislator was obliged to notify the LOPMI to the European Commission) and therefore declined to refer a preliminary question to the Court of Justice of the European Union, the very broad scope of what may constitute a ‘technical regulation’ or ‘rule on services’ nonetheless creates a credible basis for challenging the measure for failure to notify the Commission in breach of Directive (EU) 2015/1535. It is an argument with a genuine prospect of success.
It may also be possible to challenge Article 323-3-2 on the basis of non-conformity with EU law, in other words as an unjustified restriction on the free movement of services.
Furthermore, it remains to be seen how the juxtaposition of the two limbs of the offence, ie (i) not complying with LCEN or DSA obligations; and (ii) knowingly letting users share manifestly illegal content, will play out. It is easy to see the temptation to use this as a catch-all provision where platforms host manifestly illicit content and have separately not complied with DSA/LCEN obligations. However, questions arise as to how a failure to preserve metadata (Article 6.V of the LCEN) or to publish an annual moderation report (Article 15 of the DSA) would have any direct connection with the illicit content or knowledge of it.
We also anticipate litigation over what suffices to prove ‘knowledge’: could it, for example, be satisfied by repeated takedown notices? It is well accepted that platforms have no general obligation to monitor content, so a prosecution will require clear evidence of knowledge, as opposed to a mere failure of content moderation. Similarly, the scope of what counts as ‘manifestly illicit’ will probably be the subject of litigious evaluation.
It should be noted that whilst criminal investigations and charges are increasingly being brought against online platforms on the basis of Article 323-3-2, no prosecution under the provision has yet succeeded.
[1] As defined in the Law No 2004-575 of 21 June 2004 (LCEN) art 6.I-4.
[2] Directive (EU) 2015/1535 laying down a procedure for the provision of information in the field of technical regulations and of rules on Information Society services [2015] OJ L241/1.