Brazilian legal framework on automated decision-making
Sunday 9 June 2024
Camila Mariotto
Lahorgue Advogadas Associadas, Rio de Janeiro
cmariotto@lahorgue.adv.br
Automated decisions have been increasingly used by organisations for different purposes in a myriad of sectors. Decisions that were previously made by humans, such as those related to hiring and dismissing employees; assessing insurance risks and credit scores; diagnosing medical conditions; and moderating social media content, are now being made by artificial intelligence (AI) systems through the large-scale processing of personal data. Although the rise of AI has undeniably brought many benefits to society, primarily through increased productivity and efficiency, the widespread use of these systems also presents significant legal and ethical challenges to regulators worldwide.
Article 20 of Law No 13.709/2018 (LGPD, the ‘Brazilian General Data Protection Law’) regulates automated decisions. Although the LGPD is known as a GDPR-like law,[1] the two legal frameworks do not address this type of personal data processing in the same manner. This article presents the most striking differences between the two laws and briefly comments on how this topic has been addressed in Brazil.
Legal bases
Although the legal bases under the LGPD for the processing of personal data and special categories of personal data are almost the same as those outlined in the GDPR, the application of the law in Brazil regarding automated decision-making differs slightly, as follows.
Automated decisions in Brazil are authorised as long as they rely on one of the legal bases provided for in the LGPD and comply with the rights and safeguards outlined in the law. The GDPR, by contrast, sets forth a general rule that ‘the data subject shall have the right not to be subject to a decision based solely on automated processing […] which produces legal effects concerning him or her or similarly significantly affects him or her.’ This prohibition does not apply, however, if the automated decision: (1) is necessary for entering into, or performance of, a contract between the data subject and a data controller; (2) is authorised by Union or Member State law; or (3) is based on the data subject’s explicit consent. Where special categories of personal data are processed, the European legislator allows only two legal bases: the data subject’s explicit consent or processing necessary for reasons of substantial public interest, on the basis of EU or Member State law (Article 22).
Right to review
The LGPD grants data subjects the right to request a review of decisions made solely through automated processing of their personal data that affect their interests, including decisions that define their personal, professional, consumer and credit profiles, as well as other aspects of their personality (Article 20). Unlike the GDPR, however, this provision does not require that the review be conducted with human intervention. The first draft of the LGPD established that decision reviews must be carried out by a human being, but in the same year the law was approved, a legislative amendment removed this requirement. The amendment has been criticised by Brazilian scholars, who argue that human intervention is necessary to address the deficiencies of algorithmic decisions and to ensure that they are explainable.
Right to information
With regard to the right to information, the LGPD sets forth that ‘whenever requested to do so, the controller shall provide clear and adequate information regarding the criteria and procedures used for an automated decision, subject to commercial and industrial secrecy’ (Article 20, §1). If the controller refuses to provide the requested information on grounds of commercial and industrial secrecy, the Brazilian authority may carry out an audit to verify discriminatory aspects in the processing of personal data (Article 20, §2).
Although Article 20, §1, which specifically regulates automated decisions, establishes that the data subject has the right to information upon request, other principles and rights outlined by the LGPD apply to all types of personal data processing, including automated decisions. In this context, it is important to remember that Article 9 grants data subjects the right to easily access information about any processing of their data.
In this regard, it is worth mentioning that in February 2024 the Brazilian Data Protection Authority (ANPD) launched a public consultation on data subjects’ rights (the ‘Public Consultation’). A report with final conclusions has not yet been published. Even so, in the introduction to one of the consultation questions, the authority underscored the critical role of active transparency in personal data processing. According to the ANPD, transparency serves not only as a legal mandate but also as a cornerstone for fostering trust between data subjects and controllers: it demonstrates a commitment to integrity and legal compliance, promotes accountability and helps nurture a culture of respect for privacy.
Final remarks
Regulating automated decisions has become a global concern for authorities and lawmakers. In Brazil, this topic is also being discussed in Congress through several bills, including one aimed at regulating artificial intelligence. Some questions posed by the ANPD in its Public Consultation, such as what defines an automated decision and the criteria for determining when a data subject’s interests are significantly affected, have also been debated by EU authorities. Although the ANPD has often followed in the footsteps of the EU authorities when issuing guidelines and orientations, the differences between the LGPD and the GDPR in regulating automated decisions make it crucial to closely monitor the Brazilian agency’s next steps. This is essential not only for the self-determination of data subjects but also for society to assess whether the processing of personal data through automated decisions complies with the LGPD principles, including the principle of non-discrimination.
Note
[1] European General Data Protection Regulation.