Digital Services Act & Digital Markets Act: Europe regulates big digital platforms
Tuesday 9 January 2024
Verónica Volman
RCTZZ Abogados, Buenos Aires, Argentina
volman@rctzz.com.ar
Damián Navarro
RCTZZ Abogados, Buenos Aires, Argentina
navarro@rctzz.com.ar
Lisandro Frene
RCTZZ Abogados, Buenos Aires, Argentina
frene@rctzz.com.ar
With the premise that ‘what is illegal in the real world must be illegal online’, the EU institutions – the European Commission, the European Parliament and the Council of the European Union – began discussing, during 2020, two projects aimed at regulating digital platforms: the Digital Services Act (DSA) and the Digital Markets Act (DMA). Both Acts, which form part of the regulatory package known as the ‘Digital Services Package’, came into force in November 2022 and are already shaping the daily operation of digital platforms in the EU.
These Acts regulate intermediary liability in relation to advertisements, illicit content and disinformation (the DSA) and establish clear standards, from an antitrust perspective, for platforms with significant market power (the DMA). The DSA and DMA complement pre-existing national and regional regulations on consumer rights, data protection, competition and other matters.
The speed at which this regulatory package turned into active regulation in the EU reflects the regional consensus on the importance of regulating these players with significant market power. This interest is no coincidence: it stems directly from a recent and growing record of sanctions on big tech companies found to have infringed antitrust rules and/or consumer rights.
To list a few examples: in 2021, the European General Court (EGC) confirmed a €2.42 billion fine for Google in the Google Shopping case, in which the judges concluded that the company had favoured its own price comparison service over competing services by positioning its own products more favourably in the Google search engine; and in 2023, the Spanish antitrust authority imposed fines totalling €194.1 million on Amazon and Apple for restricting competition in the market for the online sale of electronic products through private agreements between the parties. These are a few salient cases, without even mentioning the civil, criminal and administrative fines imposed on big digital platforms in different EU countries for failing to act on the publication of false or illicit content (as defined in the corresponding regulations).
DSA
The DSA aims to create a safe digital space, where the fundamental rights of digital users are protected and preserved.
Although the DSA expressly exempts platforms from any general duty to actively monitor for illegal content, it establishes specific obligations whose fulfilment operates as a conditional exemption from liability. Namely, platforms will not be liable under the DSA to the extent that they fully comply with the duties and standards the Act imposes. Platforms found not to comply with their obligations may face fines of up to 6 per cent of the previous year's global annual turnover, fines of up to 1 per cent for providing incorrect information, and periodic penalties of up to 5 per cent of average daily turnover.
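To make the scale of these penalties concrete, the ceilings reduce to simple percentages of turnover. The sketch below is purely illustrative – the function, names and sample turnover figure are our own assumptions, not part of the DSA – and simply restates the three caps described above:

```python
# Illustrative sketch only: the DSA fine ceilings described above,
# expressed as percentages of turnover. All names and figures here are
# hypothetical assumptions; actual fines are set case by case.

def dsa_fine_ceilings(global_annual_turnover: float, average_daily_turnover: float) -> dict:
    """Maximum fine amounts under the percentages cited in the text."""
    return {
        # up to 6% of the previous year's global annual turnover for non-compliance
        "non_compliance_max": 0.06 * global_annual_turnover,
        # up to 1% for providing incorrect information
        "incorrect_information_max": 0.01 * global_annual_turnover,
        # periodic penalties of up to 5% of average daily turnover
        "periodic_penalty_daily_max": 0.05 * average_daily_turnover,
    }

# Hypothetical platform with EUR 10bn global annual turnover:
print(dsa_fine_ceilings(10_000_000_000, 10_000_000_000 / 365))
```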
Notably, although the DSA applies to all intermediary service providers, it demands additional compliance standards and reinforced due diligence obligations from very large online platforms (VLOPs) and very large online search engines (VLOSEs) – those with more than 45 million monthly active users in the EU (such as YouTube, Facebook, Booking, TikTok, etc).
Below, we highlight some of the main obligations the DSA imposes:
- VLOPs and VLOSEs must be transparent about their recommendation systems and algorithms, informing users of the criteria by which they rank the content, goods and services offered through their platforms;
- users must be able to opt out of targeted advertising;
- platforms are prohibited, without exception, from publishing advertisements targeted at minors or based on users' sensitive data (as defined under GDPR standards);
- intermediary service providers must apply due diligence and ‘know your customer’ processes to the people and businesses that offer goods and services or advertise on the platform;
- VLOPs and VLOSEs must carry out an annual risk assessment and take measures to mitigate the risks identified – they must also engage an independent auditor; and
- VLOPs and VLOSEs must implement a complaints-handling mechanism within the platform, as well as provide for out-of-court and judicial means of resolving claims.
This mechanism aims to balance the freedom of speech of those who publish, advertise or offer goods and services online against the rights of consumers in matters of privacy, consumer protection and protection from illicit content.
The DSA establishes that, by February 2024, each EU member state must designate a ‘Digital Services Coordinator’ responsible for enforcing the Act at the national level over small and medium-sized platforms, while the European Commission is the enforcement authority with respect to VLOPs and VLOSEs.
DMA
The DMA has a more specific range of action, as its purpose is to regulate big platforms from an antitrust perspective. The novelty of this regulation lies in its being an ex-ante regulation at the regional (EU) level, the infringement of which is punishable per se – unlike regulations that punish infringements ex post, and only where damage to the protected legal and economic interest is effectively proven.
To that effect, the DMA establishes quantitative criteria for designating so-called ‘gatekeepers’: platforms that offer ‘core platform services’ and act as a gateway between companies and consumers, with enough market power to enable or prevent competition and innovation in the digital ecosystem. The DMA presumes that a platform is a gatekeeper when: (i) it has an annual turnover in the EU higher than €7.5 billion and provides at least one core platform service in at least three EU member states; (ii) it provides a core platform service to more than 45 million users in the EU – 10 per cent of the EU population – and to more than 10,000 businesses in the EU; and (iii) it has met the thresholds in condition (ii) in each of the last three years.
Nevertheless, these criteria are merely presumptive: the European Commission may designate a digital platform as a gatekeeper even if it does not meet all the conditions mentioned above, or may accept that a company meeting all the criteria should nonetheless not be considered a gatekeeper under the DMA.
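For illustration, the quantitative presumption can be read as a cumulative threshold check. The sketch below is a schematic rendering of the criteria listed above – the data structure and field names are our own assumptions – and, as just noted, the presumption is rebuttable either way by the European Commission:

```python
# Illustrative sketch of the DMA gatekeeper presumption described above.
# The dataclass and field names are assumptions made for this example;
# the actual designation decision rests with the European Commission.
from dataclasses import dataclass

EU_TURNOVER_THRESHOLD_EUR = 7_500_000_000    # annual EU turnover higher than EUR 7.5bn
MONTHLY_ACTIVE_USERS_THRESHOLD = 45_000_000  # more than 45m users in the EU
BUSINESS_USERS_THRESHOLD = 10_000            # more than 10,000 EU businesses

@dataclass
class Platform:
    eu_annual_turnover_eur: float
    member_states_with_core_service: int
    monthly_active_eu_users: int
    eu_business_users: int
    years_over_user_thresholds: int  # years for which condition (ii) has been met

def presumed_gatekeeper(p: Platform) -> bool:
    """Cumulative check of conditions (i)-(iii); rebuttable, not conclusive."""
    size = (p.eu_annual_turnover_eur > EU_TURNOVER_THRESHOLD_EUR
            and p.member_states_with_core_service >= 3)
    gateway = (p.monthly_active_eu_users > MONTHLY_ACTIVE_USERS_THRESHOLD
               and p.eu_business_users > BUSINESS_USERS_THRESHOLD)
    entrenched = p.years_over_user_thresholds >= 3
    return size and gateway and entrenched
```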
On 6 September 2023, the European Commission designated six gatekeepers (Alphabet, Amazon, Apple, ByteDance, Meta and Microsoft) in respect of 22 core platform services (LinkedIn, Instagram, WhatsApp, YouTube and Google Chrome, among others). At the same time, the European Commission is conducting market investigations to assess whether further designations are warranted (Microsoft's Bing, Edge and Microsoft Advertising, as well as Apple's iMessage), while other companies have contested their designation as gatekeepers, with decisions pending.
Gatekeepers, as the only entities subject to the DMA, have until March 2024 to comply with its provisions. Any infringement may entail fines of up to 10 per cent of their annual worldwide turnover – and up to 20 per cent in cases of repeated infringement – as well as the imposition of structural and behavioural remedies determined by the European Commission, the enforcement authority of the DMA.
Among these obligations (dos and don'ts), some examples include:
- allow continuous and immediate data portability to users;
- allow interoperability between third-party services and those provided by gatekeepers;
- allow platform users to access the data related to the performance of their advertisements and content;
- refrain from treating the gatekeeper's own goods and services more favourably on its own platform (self-preferencing, as in the Google Shopping case);
- refrain from preventing users from uninstalling any preinstalled app if they wish to do so (as in the Android case); and
- refrain from combining data of users from different platforms without their consent.
Finally, as some of these provisions rely heavily on technical arrangements that may differ between gatekeepers, there is no single path to compliance, and the European Commission has declared itself open to a dialogue with each gatekeeper in order to achieve the DMA's goals.
Prospects in Argentina, Brazil and the United States
On this side of the Atlantic, the regulation of intermediary liability is also a subject of current interest and discussion.
In the US, there are several bills whose purpose is to regulate the behaviour and responsibility of digital platforms from different perspectives.
In parallel, one of the most relevant trials[8] of the internet era is the case in which the Department of Justice (DOJ) and 12 state prosecutors sued Google, accusing it of creating an illegal network of private agreements with cell-phone manufacturers (such as Samsung) and internet providers, thereby monopolising the search engine market.
Brazil is discussing a bill that would impose transparency obligations, systemic risk assessments and the adoption of mitigation measures on digital platforms to avoid illicit practices. The bill would also allocate civil liability to platforms for harmful content published by third parties in certain cases. Brazil has also issued a resolution stating that digital platforms are not neutral agents with respect to the content that flows through them, and that interference in the information flow is one of the pillars of their business model and profit.
In Argentina, a bill addressing the Regulation of Intermediary Digital Online Services for the Defense of Competition and Consumers was submitted in 2020 and is still pending.[10] Not coincidentally, this bill contains provisions similar to those of the DMA and DSA.
In recent years, Argentine courts have applied differing criteria regarding the scope of intermediary liability, whether under Consumer Protection Act No 24.240 – treating the platform as part of the commercialisation chain (section 40) – or under the rules on subjective collective liability in the National Civil and Commercial Code.
Privacy and competition: WhatsApp’s ex-officio investigation
Lastly, it is worth highlighting a recent case that involves the analysis of data privacy from an antitrust perspective in Argentina.
In the framework of an ex-officio investigation led by the National Antitrust Authority (CNDC) into the conditions imposed on users by WhatsApp Inc, the National Secretary of Interior Commerce (NSIC), through Resolution No 492/2021, issued a protective measure against WhatsApp. It ordered the Argentine subsidiaries of Facebook and WhatsApp: to refrain from implementing the updates to the messaging service's terms of service and privacy policy for 180 days, or until the CNDC's investigation concludes, whichever happens first; to refrain from exchanging users' data under the updated terms, even where WhatsApp users had accepted those new terms; and to inform all users of the complete text of Resolution No 492/2021.
The Resolution was appealed before the National Civil and Commercial Chamber of Appeal, which, on 22 September 2022, held that the NSIC decision was justified under section 44 of Antitrust Act No 27.442, in order to avoid potential and hypothetical damage to the general economic interest protected by antitrust regulation.
In that sense, the Chamber of Appeal considered that an anticompetitive practice could take place which, if committed, could be difficult or impossible to remedy and would irremediably affect consumers who had provided their personal data without limitation. The Chamber of Appeal also highlighted that the interim measure provided for in the Antitrust Act is intended to widen the alternatives available to users, consumers and potential competitors by preserving their rights while the case is processed, and to prevent the consummation of an illicit or anticompetitive conduct from undermining the effectiveness of state action.
As noted by the Chamber of Appeal, the NSIC indicated that the company's conduct regarding the exchange and processing of information collected within the WhatsApp group (used for sales to advertisers) could reasonably amount to two anticompetitive practices: (i) an exclusionary abuse of a dominant position, since, by processing and exchanging information obtained from users of all its platforms, Facebook and WhatsApp LLC would gain a competitive advantage that competitors in the online advertising, instant messaging and social network markets could not reproduce or contest; and (ii) an exploitative abuse of a dominant position over users of the Facebook and WhatsApp platforms, through the collection and processing of the private data provided in their WhatsApp, Facebook and Instagram accounts, without a real possibility of limiting the processing of that data.
Meanwhile, the administrative investigation before the CNDC into the presumptive anticompetitive conduct is still ongoing, with the authority observing that WhatsApp may be combining its users' data with Facebook's, abusing its dominant position and hindering competition in the digital market.
In this way, the use of private data by big digital platforms is now being scrutinised by antitrust authorities and will have a significant impact on antitrust and consumer protection regulation.
Next steps
Big two-sided digital platforms include search engines, social networking services, operating systems and online intermediation services, among many other examples. Their sophistication results in a complex web of relationships between consumers, businesses and competitors.
The recent growth of digital platforms and digital service providers has been accompanied by a variety of claims from consumers, workers and competitors, which have been partially addressed by specific regulatory measures in Argentina, for example: the Central Bank's regulation of Payment Service Providers (PSPs), the treatment of platform workers' claims (for example, Rappi or Pedidos Ya), the regulation of transport services (such as Cabify or Uber) and the designation of Information and Communications Technology (ICT) services and access to telecommunications networks as a public service through Decree No 690/20.
Notwithstanding these attempts, additional difficulties arise, such as regulating freedom of speech where it confronts hate speech and fake news, the unequal treatment of streaming services and traditional media, and the implementation and limits of the net neutrality principle.
Ex-post antitrust enforcement by national authorities, as well as certain – and sometimes unpredictable – interpretations of consumer protection regulations, do not seem sufficient to cope with the regulatory challenges posed by digital platforms, given their network effects, global scale and vertical and horizontal integration.
In conclusion, a higher level of regulation in these matters is to be expected, assimilating the recent regulatory trends enacted in the EU and under discussion in the US.