Brazil Data Protection Law – Litigation in the Context of Employment

  • Employers operating in Brazil will likely see an uptick in litigation involving claims filed under the country’s Data Protection Law (LGPD).
  • The Brazilian National Data Protection Agency, the entity charged with enforcing the LGPD, recently issued new guidance on this law.

The Brazilian Data Protection Law (LGPD), in effect since 2020, is beginning to make its effects felt in the litigation landscape.

Although the Brazilian National Data Protection Agency (ANPD) has been fairly active in promoting and regulating the LGPD since it began operating in 2021, there are many points of the law the ANPD has not yet addressed, and the number of audits and fines imposed remains very small for a country with more than 200 million potential data subjects.

From January 2021 to December 2023, the ANPD received 825 data breach incident reports from companies, the majority related to hacking and ransomware, and 2,950 complaints from data subjects relating to the frustrated exercise of their rights or reporting violations of the LGPD. Currently, the ANPD is auditing eight private-sector companies, and only one company has been fined.

Based on these figures, many companies, including several U.S. and other foreign parent companies, have not been particularly concerned about the risk of being audited or fined by the agency. However, many of these companies are unaware of a parallel risk that is brewing at a much faster pace: litigation.

Brazil is well known for its extremely high rate of litigation. In any given year, the labor courts receive, on average, three million new labor and employment cases. Most are “traditional” labor and employment disputes, but the number of cases invoking the LGPD is growing, and we predict that in a short period of time data privacy claims will become a common cause of action among these three million annual cases.

Case studies

A study released in April 2023 by the Instituto Brasileiro de Ensino, Desenvolvimento e Pesquisa (IDP) (a research institute) and Jusbrasil (a legal tech company) (the “Study”) identified 1,789 judicial cases before the civil, criminal, and labor courts and tribunals that mentioned the LGPD between September 2021 and September 2022. In 189 of those 1,789 cases, all decided by the labor courts, the applicability of the LGPD was debated in a significant way or was the central issue of the case.

A few areas are worth highlighting here.

Geolocation

Several labor complaints have included requests for the disclosure of workers’ geolocation data to prove compliance with working hours. Employers have requested the production of digital geolocation evidence, obtained through applications installed on employees’ cell phones, to show where employees were actually located when they were supposed to be working their shifts. Some decisions reject these requests out of respect for workers’ privacy and informational self-determination, to the detriment of the company’s right to a broad defense through this means of evidence (Case Nos. 0020329-81.2020.5.04.0006 and 0011155-59.2021.5.03.0000).

Background Check

One case identified in the Study is particularly significant for employers in Brazil and their parent companies abroad. The case touches on several sensitive areas for employers, and the judgment award, although still unknown, may reach six figures or more. Its key points are discussed below.

  1. LGPD may apply to old pending cases

The case1 dates from 2012, long before the LGPD took effect in 2020. Subsection I Specialized in Individual Disputes (SDI-I) of the Superior Labor Court (TST), the highest labor tribunal in Brazil, ruled in early 2022 that the LGPD should apply to this case because the underlying facts continue to occur.

  2. The case is against a vendor (but a client is not safe either)

The case is a type of class action called a “public civil action,” filed by the Labor Prosecutor’s Office (MPT) of the 10th Region against a risk management company. The MPT sought an injunction prohibiting the company from using a database scraped from various publicly available sources, as well as collective moral damages (a type of punitive damages) on behalf of all employees of, and candidates for employment with, the risk management company’s clients that sought risk analyses before hiring cargo drivers.

The risk management company used publicly accessible credit score/history data and the criminal records of applicants for cargo transport jobs to create a risk assessment database, which it sold to clients and to insurance companies insuring the cargo and its transport.

  3. Scraping publicly available data

The MPT’s two core questions to the court were whether it was permissible to create this database from personal credit information obtained from a lawful credit score database, and whether the use of the newly created risk assessment database could give rise to potentially discriminatory treatment.

The MPT lost its claim at all judicial levels until the case reached the SDI-I, which reversed the lower courts’ rulings and granted the MPT a permanent injunction, ordering the company to (a) stop using the database it created and stop providing and/or searching for information relating to job candidates’ credit restrictions, subject to a fine of $2,000 per candidate for each breach, and (b) pay collective moral damages to be calculated in the execution (damages) phase.

  4. Illegal activity and discriminatory treatment

The lower courts reasoned that the risk management company could not be held responsible in this case because its business was legal and, if any discriminatory practice did take place, it was carried out not by the risk management company but by the clients that bought the data. The SDI-I disagreed.

The SDI-I established that the publicly available credit report database the company relied upon is intended to protect credit granted by banks, individuals, and commercial associations. It should not be used to assess a driver’s employability or the probability that the driver will steal transported goods. The SDI-I pointed out that the Superior Labor Court has uniformly held that credit information cannot be required from employees and job candidates; thus, using the data, or allowing it to be used, for any purpose other than protecting the provision of credit by banks, unless authorized by law, is illegal under the LGPD.

It further reasoned that the risk management company’s database violated the LGPD’s principles. Specifically, it violated the principle of purpose because the company used personal data collected for one purpose (that is, a credit history and score for the purpose of credit protection) and processed it for an entirely different purpose (creating a risk assessment database for employers and insurance companies) to which the data subjects did not freely and knowingly agree.

The rapporteur of the case2 presented his views to the court, and they were included in the court’s decision. Among his comments, he stated the following, which may be invoked in future disputes:

  • That disclosing this information without specific consent may violate the constitutional right to privacy, not just the LGPD.
  • That a careless reading of the LGPD might lead one to think that the company's use of the data was permissible as "data made manifestly public by the data subject." But there is a distinction between "publicly accessible data" and "data made public by the data subject."
  • That the violation here was more serious because the data at issue falls within the category of "sensitive personal data."

  5. Processing Sensitive Personal Data

This last point is particularly troublesome because the LGPD’s definition of sensitive personal data does not include criminal records or credit scores/history. The labor courts have previously expanded the list of protected categories under the anti-discrimination law by treating it as merely illustrative. Here, the rapporteur likewise took the position that the list of sensitive data under the LGPD is not exhaustive and that anything that can be used to discriminate against individuals, such as a person’s criminal and credit history, is sensitive data. This position may still be challenged, but it creates a potentially damaging precedent and thus a new issue for companies that run and use background checks.

The processing of sensitive personal data requires additional steps and restricts the available legal bases for processing. For instance, a company’s “legitimate interest” is not a legal basis for processing sensitive personal data.

  6. Use of algorithms

In addition, the rapporteur discussed the use of algorithms (and would likely have mentioned AI had the decision been more recent) in connection with such a risk assessment database, noting that it merely compounds the dangers of discrimination and privacy invasion addressed by the LGPD and the court. The use of algorithms made it possible to create discriminatory standards without giving job applicants and employees any means of exercising their right to have their personal and professional profiles reviewed.

New ANPD actions

On January 30, 2024, the ANPD launched its Glossary of Personal Data Protection and Privacy. The document contains the ANPD’s official position on the meaning of the main concepts, terms, and expressions used in personal data protection legislation and in the ANPD’s own documents. The Glossary brings together, with cross-references, information previously dispersed across several documents, facilitating access for both citizens and professionals in the field.

Even more significant than the Glossary is the new Guide on Legitimate Interest, published on February 2, 2024.

The Guide defines parameters for using legitimate interest as a legal basis for the processing of personal data. It addresses the concept of “legitimate interest,” the fundamental rights and legitimate expectations of data subjects, the prohibition on using “legitimate interest” as a basis for processing sensitive personal data, and the balancing of interests between controllers/third parties and data subjects. It also provides a number of examples to clarify the ANPD’s position and understanding of these concepts, as well as a simplified balancing test model for companies to use in their assessments.

According to the Guide, the processing of personal data based on legitimate interest presupposes the identification and mitigation of risks to the fundamental rights and freedoms of data subjects. In this sense, a balancing test must be carried out so that controllers can better assess (and demonstrate, if prompted) whether the impacts are proportionate and compatible with fundamental rights, and determine which safeguards should be adopted to minimize the risks.

Employers will also need to take these steps when relying on “legitimate interest” as a basis for collecting and using personal data, and should be mindful of the risk of being sued for moral damages when the asserted legitimate interest fails the fundamental rights balancing test.

The Guide provides an example of an issue that has arisen in various countries, including the United States: the installation of activity-tracking software to measure productivity. The Guide suggests that collecting data through such software, including recording images and everything the employee types, interferes excessively and disproportionately with the fundamental rights and freedoms of data subjects and goes against their legitimate expectations, even if the activity was disclosed in advance and included in the privacy policy. The data collection in that scenario goes far beyond what is necessary to meet the intended objective (i.e., measuring performance), such that data subjects could not reasonably expect the employer to carry it out.

Final Considerations

The Superior Labor Court, in a recent article, summarized the matter we address here: the employer bears the responsibility of implementing the LGPD. Despite existing gaps in certain areas, it remains crucial to prioritize data protection, privacy, and other fundamental rights of workers. Key actions include conducting a Data Protection Impact Assessment, formulating an access policy, establishing an information security system to safeguard data processing (including biometric data collection and storage), defining data sharing practices, creating a retention and disposal policy, and conducting awareness sessions and training for employees. These measures are essential to mitigate risks and uphold data subjects’ rights. Non-compliance may result in fines and other administrative penalties imposed by the ANPD. Controversial issues remain that await a position from the ANPD or the courts to ensure the rights of data subjects in accordance with the principles of labor law and the LGPD.

Although many multinational companies now have separate, dedicated data privacy teams, in-house counsel and HR personnel whose responsibilities include Brazil operations must remain engaged and actively participate in data protection matters to ensure that the company is, and will continue to be, in compliance with the data protection law and to minimize the risk of complaints from workers based on violations of their fundamental rights.


See Footnotes

1 TST-E-RR-933-49.2012.5.10.0001, decision issued by the Justices of Subsection I Specialized in Individual Disputes of the Superior Labor Court on December 16, 2021 and published on February 25, 2022.

2 Justice Cláudio Mascarenhas Brandão.

Information contained in this publication is intended for informational purposes only and does not constitute legal advice or opinion, nor is it a substitute for the professional judgment of an attorney.