Colombia’s Law 2489 of 2025 and its draft implementing decree: a new frontier in children’s online safety regulation in Latin America
Thursday 26 March 2026
Sergio Michelsen
Brigard Urrutia, Bogotá
smichelsen@bu.com.co
Andrés Fernández de Castro
Brigard Urrutia, Bogotá
afernandezdecastro@bu.com.co
Luis Felipe García
Brigard Urrutia, Bogotá
lgarcia@bu.com.co
Introduction
On 17 July 2025, the Colombian Congress enacted Law 2489 of 2025, a statute establishing provisions for the development of safe and healthy digital environments for children and adolescents in the country. The statute mandates the development of a public policy to articulate efforts among government entities, the private sector, families and civil society, with the purpose of promoting risk prevention, healthy technology habits and the guarantee of children’s digital rights. Article 14 of the Law requires the Government to issue implementing regulations within six months following its entry into force. In response, the Ministry of Information and Communications Technologies (MinTIC) released a draft presidential decree in December 2025 for comment, proposing the addition of a new title to Decree 1078 of 2015, the sector’s single regulatory decree.
This article provides an overview of both the Law and the draft decree, situates them within the broader international regulatory landscape and identifies certain questions of legal and practical significance that observers and practitioners may wish to consider.
Other jurisdictions
Colombia’s initiative does not emerge in a vacuum. Over the past decade, a global wave of regulation has targeted the protection of minors in digital environments. As documented in one recent survey, at least 26 regulations in 19 jurisdictions have been adopted, reflecting four principal regulatory approaches: content-based, design-based, transparency-based and due process-based.[1]
The UK’s Online Safety Act 2023 imposes obligations on digital platforms relating to the protection of minors, including children’s risk assessments and reasonable and proportionate mitigation measures, with enforcement by the Office of Communications (OFCOM). In parallel, the UK Age Appropriate Design Code (‘Children’s Code’), which came into full effect in 2021 under the Information Commissioner’s Office (ICO), establishes 15 standards for age-appropriate design, including maximum privacy settings by default, data minimisation and restrictions on persuasive design mechanisms (nudging) directed at children. The ICO’s evaluation work has reported emerging evidence of changes in children-focused privacy and safety settings across major services.
In the EU, the Digital Services Act (DSA) requires very large online platforms to evaluate and mitigate systemic risks affecting minors, including exposure to illegal or harmful content and the misuse of their data by algorithmic or recommender systems. The DSA prohibits platforms from displaying advertising based on profiling when they know with sufficient certainty that the user is a minor. Sanctions for non-compliance can reach up to six per cent of the company’s global revenues. In 2025, the European Commission issued its Guidelines on the Protection of Minors Online, recommending measures such as private accounts by default for children, restricted messaging from unknown users and restrictions on behavioural profiling for commercial purposes.
In the US, the Children’s Online Privacy Protection Act (COPPA), enacted in 1998 and with its implementing rule most recently amended in 2025, establishes specific obligations for services directed at children under 13, including verifiable parental consent for data collection and restrictions on behavioural advertising.
Meanwhile, Australia’s Online Safety Amendment (Social Media Minimum Age) Act 2024, which entered into force in December 2025, requires age-restricted social media platforms to take reasonable steps to prevent users under the age of 16 from creating or keeping accounts, with potential fines of up to AUD 50m for non-compliant platforms.
Overview of the Law
These international developments form the backdrop against which Colombia’s Law 2489 of 2025 and its draft decree must be understood.
Law 2489 establishes a framework of co-responsibility among the State, families, the private sector, civil society and educational communities. Its guiding principles include the best interests of the child, comprehensive protection, the progressive evolution of children’s capacities and proportionality. Article 3 sets out a proportionality standard, requiring that protective measures be evidence-based, effective and balanced, designed to maximise opportunities for children in the digital environment while safeguarding their freedom of expression, avoiding excessive punishment and not unduly restricting digital services or innovation.
The Law assigns specific responsibilities to MinTIC, the Communications Regulatory Commission (CRC), the Ministry of Education, the Colombian Institute for Family Welfare, the Attorney General’s Office and the National Police, among others. It creates a National Committee on Technology, Childhood and Adolescence (National Committee), mandates a repository of best practices in digital safety and establishes an integrated monitoring and evaluation system. The Law draws expressly on the UN Convention on the Rights of the Child and, in particular, the UN Committee on the Rights of the Child’s General Comment No 25 on children’s rights in relation to the digital environment, which calls on States parties to protect children from harmful content while ensuring that businesses develop guidelines enabling children to safely access diverse content in accordance with their evolving capacities.
The draft implementing decree
The draft decree proposes a set of concrete obligations that would apply, at least arguably, to digital platforms and internet services accessible to minors. At its core, the decree introduces the concept of secure design by default, requiring that platforms incorporate, from the conception and development stage, initial configurations that automatically apply high levels of security and rights protection in the absence of any active user intervention. The decree requires platforms to automatically configure the most restrictive and protective settings with respect to privacy, exposure to harmful, illegal or inappropriate content, interaction with third parties and algorithmic recommendation and amplification mechanisms.
Closely linked to this concept is a minors’ mode, a default configuration applied to users identified as minors through reasonable age determination mechanisms. The draft materials further signal that age assurance mechanisms should respect data minimisation and prioritise anonymous estimation techniques, with oversight of these processes referred to the Superintendence of Industry and Commerce (SIC) given its competence over personal data processing. The decree also requires graduated, age-based protection, with differentiated levels of supervision and autonomy, transparent interfaces adapted to each age group and gradual transitions between stages.
On the content side, the decree establishes a content classification system based on four categories: content suitable for all ages, for users over seven, for users over 13 and for users over 18, to be applied according to criteria of age, risk (content, conduct, contact and consumer risks), and thematic considerations such as sex, violence, addictive products and services and pornography.
Alongside classification, the decree imposes content labelling obligations: when a user is in minors’ mode, platforms must provide clear, prominent, accessible and prior labelling indicating the age range, risk description, presence of sensitive themes and recommendations for adult supervision, including for live transmissions.
The decree further requires platforms to submit annual compliance reports to MinTIC, endorsed by an external audit, subject to review by the National Committee.
Key legal and practical questions
While the objectives of the Law and the draft decree are broadly consistent with international trends, several questions of legal and practical significance deserve careful consideration.
A threshold question concerns the scope of regulatory competence. Law 2489 directs MinTIC, in coordination with the CRC and within the framework of existing sector competences, to advance the regulatory implementation of the statute. An open issue is how far that mandate can reach, in practice, when applied to over-the-top services or broader digital platform services that are not traditionally regulated as telecommunications networks or services. Whether the imposition of obligations such as content classification, auditing or operational controls on such platforms falls within the applicable regulatory mandate is a matter that may merit further analysis.
Proportionality presents a related concern. Article 3 of the Law itself establishes a clear proportionality standard, yet certain provisions of the draft decree, such as the requirement to apply the most restrictive and protective settings by default, could in practice lead to overly broad restrictions affecting children’s freedom of expression and access to information. Finding the appropriate balance between protection and the full exercise of children’s digital rights remains a challenge shared by all jurisdictions engaged in this area of regulation.
Age assurance and the activation of minors’ mode may involve processing personal data, potentially including minors’ data, which will need to be aligned with Colombia’s data protection framework. The draft materials themselves emphasise data minimisation and contemplate SIC oversight of age assurance processes. Finally, the absence of a clear definition of certain key terms in the draft decree, such as illegal content for the purposes of content moderation, may create uncertainty for platforms seeking to comply in good faith.
Conclusion
Colombia’s Law 2489 of 2025 and its draft implementing decree represent one of the most comprehensive attempts in Latin America to establish a regulatory framework for the protection of children in digital environments. The initiative draws on international standards such as the UN Convention on the Rights of the Child, the DSA and the UK Age Appropriate Design Code. At the same time, the draft decree raises questions concerning regulatory authority, proportionality, data privacy and legal certainty that will deserve close attention as the process moves forward. For practitioners and regulators elsewhere, Colombia’s experience will offer a valuable case study in the effort to reconcile children’s online protection with fundamental rights and innovation.
[1] Mariana Olaizola Rosenblat, Ayushi Agrawal and Isaac Yap, ‘Online Safety Regulations Around the World: The State of Play and the Way Forward – A Resource Guide’ (NYU Stern Center for Business and Human Rights, 2025).