Global Data Protection Compliance FY24/02

A Quarterly Roundup of Key Developments Across Continents
July 11, 2024
Written by Agnieszka Hinz

In an era defined by rapid technological advancements, CATTS Data Protection's Quarterly News for FY24/02 provides a crucial overview of pivotal developments in data protection across the European Union, Poland, The Netherlands, Spain, Mexico, India, and Brazil. As society becomes increasingly reliant on digital platforms, understanding the profound impact of regulations and decisions in the realm of data protection is paramount.

The evolving landscape, as discussed in the recent news, underscores the ongoing efforts to enhance data protection and privacy across various regions. In the EU, the European Data Protection Board (EDPB) emphasized the need for real choice in 'consent or pay' models, ensuring users are not compelled to consent to personal data processing for behavioral advertising. The introduction of new resources and languages for small businesses, as well as the focus on responsible use of facial recognition technologies at airports, reflect the EDPB's commitment to GDPR compliance and individual rights. These developments highlight the global significance of adapting data protection frameworks to protect citizens' rights and maintain trust in the digital age.

Data protection in the EU

  • EDPB: ‘Consent or Pay’ models should offer real choice

During its latest plenary, the EDPB adopted an Opinion following an Art. 64(2) GDPR request by the Dutch, Norwegian & Hamburg Data Protection Authorities (DPA). The Opinion addresses the validity of consent to process personal data for the purposes of behavioral advertising in the context of ‘consent or pay’ models deployed by large online platforms. 

As regards ‘consent or pay’ models implemented by large online platforms, the EDPB considers that, in most cases, it will not be possible for them to comply with the requirements for valid consent, if they confront users only with a choice between consenting to processing of personal data for behavioral advertising purposes and paying a fee.

The EDPB considers that offering only a paid alternative to services which involve the processing of personal data for behavioral advertising purposes should not be the default way forward for controllers. When developing alternatives, large online platforms should consider providing individuals with an ‘equivalent alternative’ that does not entail the payment of a fee. If controllers do opt to charge a fee for access to the ‘equivalent alternative’, they should give significant consideration to offering an additional alternative. This free alternative should be without behavioral advertising, e.g. with a form of advertising involving the processing of less or no personal data. This is a particularly important factor in the assessment of valid consent under the GDPR.

The EDPB stresses that obtaining consent does not absolve the controller from adhering to all the principles outlined in Art. 5 GDPR, such as purpose limitation, data minimisation and fairness. In addition, large online platforms should also consider compliance with the principles of necessity and proportionality, and they are responsible for demonstrating that their processing is generally in line with the GDPR. 

As regards the need for consent to be free, the following criteria should be taken into account: conditionality, detriment, imbalance of power and granularity. For instance, the EDPB points out that any fee charged cannot make individuals feel compelled to consent. Controllers should assess, on a case-by-case basis, both whether a fee is appropriate at all and what amount is appropriate in the given circumstances. Large online platforms should also consider whether the decision not to consent may lead the individual to suffer negative consequences, such as exclusion from a prominent service, lack of access to professional networks, or risk of losing content or connections.  The EDPB notes that negative consequences are likely to occur when large online platforms use a ‘consent or pay’ model to obtain consent for the processing.

  • Europe Day 2024

Europe Day commemorates the signing of the Schuman Declaration, to celebrate peace and solidarity in Europe. Every year, the EDPB takes part in Europe Day, with an interactive stand manned by volunteers from the EDPB Secretariat and national DPAs, to raise awareness of data protection and to provide information about the EDPB’s activities.

The EDPB and the EDPS will welcome you at the village “Our strong digital Europe”, showcasing a variety of fun activities to help you learn more about privacy and data protection.

  • EDPB launches French and German versions of its Data Protection Guide for small business

The Guide provides practical information to SMEs about GDPR compliance and benefits in an accessible and easily understandable language.

The development of tools providing practical, easily understandable and accessible data protection guidance is key to reaching a non-expert audience and a strategic objective for the EDPB.

The EDPB Data Protection Guide for small business covers various aspects of the GDPR, from data protection basics to data subject rights and measures to secure personal data. It contains videos, infographics, interactive flowcharts, and other practical materials to help SMEs on their way to becoming GDPR compliant.

In the near future, the Guide will become available in 15 more European languages.

  • Facial recognition at airports: individuals should have maximum control over biometric data

During its latest plenary, the EDPB adopted an Opinion on the use of facial recognition technologies by airport operators and airline companies to streamline the passenger flow at airports. This Article 64(2) Opinion, following a request from the French Data Protection Authority, addresses a matter of general application and produces effects in more than one Member State.

The Opinion analyses the compatibility of the processing with the storage limitation principle (Article 5(1)(e) GDPR), the integrity and confidentiality principle (Article 5(1)(f) GDPR), data protection by design and default (Article 25 GDPR) and security of processing (Article 32 GDPR). Compliance with other GDPR provisions, including those regarding the lawfulness of the processing, is not in scope of this Opinion.

There is no uniform legal requirement in the EU for airport operators and airline companies to verify that the name on the passenger’s boarding pass matches the name on their identity document, and this may be subject to national laws. Therefore, where no verification of the passengers’ identity with an official identity document is required, no such verification with the use of biometrics should be performed, as this would result in an excessive processing of data.

In its Opinion, the EDPB considered the compliance of processing of passengers’ biometric data with four different types of storage solutions, ranging from ones that store the biometric data only in the hands of the individual to those which rely on a centralised storage architecture with different modalities. In all cases, only the biometric data of passengers who actively enrol and consent to participate should be processed.

The EDPB found that the only storage solutions which could be compatible with the integrity and confidentiality principle, data protection by design and default and security of processing, are the solutions whereby the biometric data is stored in the hands of the individual or in a central database but with the encryption key solely in their hands. These storage solutions, if implemented with a list of recommended minimum safeguards, are the only modalities which adequately counterbalance the intrusiveness of the processing by offering individuals the greatest control.
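
By way of illustration only (this is not part of the EDPB Opinion), the following Python sketch shows the general idea behind the storage model in which a central database holds only ciphertext while the encryption key remains solely in the passenger's hands. The use of AES-GCM from the cryptography package and all function and variable names are assumptions for the example.

```python
# Hypothetical sketch: the airport system stores only ciphertext; the key stays
# on the passenger's device, mirroring the "encryption key solely in the
# individual's hands" storage model described in the Opinion.
import os

from cryptography.hazmat.primitives.ciphers.aead import AESGCM  # assumed third-party dependency


def enroll(biometric_template: bytes) -> tuple[bytes, bytes]:
    """Run on the passenger's device: encrypt the template, keep the key locally."""
    key = AESGCM.generate_key(bit_length=256)   # never leaves the passenger's device
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, biometric_template, None)
    return key, nonce + ciphertext              # only nonce + ciphertext go to the server


def verify(key: bytes, stored_record: bytes) -> bytes:
    """At the gate: the passenger presents the key; the server holds only ciphertext."""
    nonce, ciphertext = stored_record[:12], stored_record[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, None)


if __name__ == "__main__":
    key, record = enroll(b"example-template-bytes")
    assert verify(key, record) == b"example-template-bytes"
```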

Next, a report was adopted by the DPAs on the work of the ChatGPT taskforce. This taskforce was created by the EDPB to promote cooperation between DPAs investigating the chatbot developed by OpenAI.

The report provides preliminary views on certain aspects discussed between DPAs and does not prejudge the analysis that will be made by each DPA in their respective, ongoing investigation.

It analyses several aspects concerning common interpretation of the applicable GDPR provisions relevant for the various ongoing investigations, such as:

• lawfulness of collecting training data (“web scraping”), as well as processing of data for input, output and training of ChatGPT.

• fairness: ensuring compliance with the GDPR is a responsibility of OpenAI and not of the data subjects, even when individuals input personal data.

• transparency and data accuracy: the controller should provide proper information on the probabilistic nature of ChatGPT’s output and refer explicitly to the fact that the generated text may be biased or made up.

The report points out that it is imperative that data subjects can exercise their rights effectively.

Taskforce members also developed a common questionnaire as a possible basis for their exchanges with OpenAI, which is published as an annex to the report.

Furthermore, the EDPB decided to develop guidelines on Generative AI, focusing as a first step on data scraping in the context of AI training.

Finally, the EDPB adopted a statement on the Commission's "Financial data access and payments package" (which includes the proposals for the Regulation on the framework for Financial Data Access (FIDA), on the Payments Service Regulation (PSR) and on the Payment Services Directive 3 (PSD3)).

The EDPB takes note of the European Parliament’s reports on the FIDA and PSR proposals, but considers that, with regard to the prevention and detection of fraudulent transactions, additional data protection safeguards should be included in the transaction monitoring mechanism of the PSR Proposal. It is important to ensure that the level of interference with the fundamental right to the protection of personal data of persons concerned is necessary and proportionate to the objective of preventing payment fraud.

Data protection in Poland

  • Concern for people's data more important than the controller's interest

UODO imposed an administrative fine on Santander Bank Polska S.A. for failing to report a data protection breach. Similarly, Toyota Bank Polska S.A. was fined for failing to report a data protection violation.

  • Cryptocurrency bill needs to clarify data protection rules

The law on cryptocurrencies will work better if the rules for processing the personal data of participants in this market are written into it. The President of the Office for Personal Data Protection (UODO) commented on the draft law on which the government is working.

The proposed law is to implement the provisions of Regulation 2023/1114 of May 31, 2023 on crypto-asset markets, to introduce provisions for the protection of customers and investors and the integrity of the crypto-asset market, and to ensure the application of Regulation 2023/1113. With these provisions, supervision of the crypto-asset market is to be more effective, and the protection of customers and investors is to be better. The Office for Personal Data Protection analyzed the February 22 version of the cryptocurrency bill and identified issues that need to be clarified regarding the protection of the personal data of participants in this market.

  • The controller must have a basis for processing the data

When acquiring data about a person's state of health, that person's explicit consent is necessary, the Provincial Administrative Court in Warsaw confirmed.

The court dismissed a complaint by the partners of the civil company PIONIER Law Firm against the decision of the President of the Office for Personal Data Protection imposing a penalty on this administrator for having processed personal data without a legal basis.

In its activities, the company reached out to victims, mainly of traffic accidents, to establish cooperation in representing them, among other things, before insurance companies and in court cases in order to obtain compensation, damages and pensions in their favor, as well as reimbursement of medical treatment and rehabilitation costs. The firm obtained information about potential clients based on, among other things, press news and online publications, including content available on social media, as well as information provided or distributed by charitable organizations. During meetings, a representative of the PIONIER law firm obtained verbal consent to process personal data pending the possible conclusion of a contract with these individuals for services.

  • Data protection safeguards are important for safe implementation of the Artificial Intelligence Act

The President of UODO, during a meeting of the Permanent Subcommittee of the Polish Parliament on Artificial Intelligence and Algorithm Transparency, pointed out that a large part of the solutions provided by the EU Artificial Intelligence Act relate to the processing of personal data. This is due, among other things, to the need to train artificial intelligence algorithms on data.

The meeting of the Standing Subcommittee was devoted to a presentation by the Minister of Digitization on the regulations contained in the Artificial Intelligence Act (AI Act). The President of the Personal Data Protection Office also shared his insights on the act with MPs and meeting participants.

  • Schengen Evaluation Mission to the DPA

On April 15, 2024, a Schengen evaluation mission began at the Office for the Protection of Personal Data. The Office hosted a team of experts appointed by member states and the European Commission.

The visit is taking place in connection with the implementation of the mechanism for evaluating the correct implementation and application of data protection requirements under the Schengen acquis, to which Poland is subject this year.

The mechanism ensures the effective, efficient and correct application of the Schengen acquis by member states, contributing to maintaining mutual trust between member states. It makes it possible to quickly identify deficiencies in the application of the Schengen acquis that could interfere with the proper functioning of the Schengen area, to ensure their prompt elimination, and to provide a basis for dialogue on the functioning of the Schengen area as a whole.

  • Internet ID for advertising personalization is personal data

The CJEU ruling confirms the correct practice of the supervisory authority in considering identifiers as personal data.

Following the March 7, 2024 judgment of the Court of Justice of the European Union (CJEU) in Case C-604/22 IAB Europe, it is not necessary to change national laws, but the judgment will affect the interpretation of existing data protection regulations.

In a letter to the Minister for European Union Affairs, the President of UODO points out that the CJEU's case law to date shows that the concept of personal data should be understood broadly. It can include information in the form of opinions and assessments, including subjective ones; the condition for such information to be considered personal data as defined in the GDPR is that it concerns a person.

  • Police press releases are not the place to reveal personal information

The President of UODO has initiated administrative proceedings in connection with the disclosure of data by the Krakow Police Department.

The President of the Office for Personal Data Protection (UODO) investigated a violation of personal data protection involving the disclosure of sensitive information about an individual in a press release by the Krakow Municipal Police Station. In the course of the investigation, findings were made that led the President of the UODO to initiate administrative proceedings against the Municipal Police Chief in Krakow, who is the controller of the disclosed data.

The proceedings concern the Commander's violation of, among other things:

  • the principle of processing data in accordance with the law;
  • the principle of ensuring adequate security of data;
  • and the processing and disclosure of data without a legal basis.

  • UODO president took part in the work of the parliamentary subcommittee on artificial intelligence

The Office for Personal Data Protection wants to support legislators in their work on solutions regulating artificial intelligence. Personal data is key for artificial intelligence, because AI systems very often learn from it.

On May 9, 2024, the President of the DPA briefed members of the standing subcommittee on artificial intelligence and transparency of algorithms on the Office's work on proposed guidance for the design and adaptation of national law to data protection requirements in connection with the use of artificial intelligence systems. The proposals in the guidance can be used by the Ministry of Digitization, other ministries and parliament in legislative work.

The right to data protection is a key element in the management and regulation of artificial intelligence systems, which requires constant attention to, and compliance with, data protection rules by all entities that process personal data using these technologies. Therefore, according to the President of the DPA, it is necessary to agree on a common approach to the protection of personal data under the GDPR and the European Artificial Intelligence Act. In particular:

  • GDPR principles such as data minimization, purpose limitation, accuracy, and integrity and confidentiality are essential for the proper implementation of AI;
  • AI systems must be designed so that people can exercise their rights of access, rectification, erasure, restriction of processing and objection in an easy and accessible manner;
  • of particular importance in the context of AI is Article 22 of the GDPR, which governs the right not to be subject to a decision based solely on automated processing, including profiling, that produces legal effects concerning a person or similarly significantly affects them.

  • Nearly PLN 240,000 fine for company whose employee lost pendrive with personal data

An employee of a catering company lost a pendrive with personal data. The President of the Office for Personal Data Protection (UODO) determined that the way personal data was processed at this company was not in compliance with the applicable provisions of the GDPR, due to an incorrectly conducted risk analysis which did not foresee the danger of losing the data carrier. As a result, adequate organizational and technical measures were not applied to ensure safe data processing.

The employee lost a pendrive that contained unencrypted files with another employee's personal data: name, address, citizenship, gender, date of birth, PESEL number, passport series and number, phone number, email address, photos and salary data. The pendrive also contained encrypted files with financial data.

In the course of the investigation, the company showed that it had documents such as a risk register and confirmations that it had monitored its GDPR procedures. The problem, however, turned out to be the rules for using external data storage media, including their encryption. The company informed employees about how to encrypt files in an instructional video, and this, the DPA noted, shifted responsibility onto them for how they processed data.

What was the problem?

  • The President of the DPA found that the company had misjudged the risk to the data. It assumed that data carriers could be stolen or destroyed - but failed to take into account that a carrier could simply be lost without malicious intent.
  • On top of this, despite assuming the occurrence of various events, cryptographic solutions for protecting personal data on external media were not implemented. An instructional video on "how to encrypt files on a pendrive and what program to use for this purpose" is not enough in view of the scope of data processed on such media (a minimal illustration of such encryption is sketched after this list).
  • Another problem was that the company failed to fulfill its obligation to regularly measure, test and evaluate the effectiveness of the security measures used.
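
For illustration only, the sketch below shows what a technical safeguard of this kind could look like in practice: files are encrypted before being written to removable media, so a lost pendrive exposes only ciphertext. The use of Fernet from the Python cryptography package and all file and function names are assumptions for the example, not measures prescribed by the UODO decision.

```python
# Hypothetical sketch: encrypt a file before copying it to removable media,
# so that losing the pendrive does not expose readable personal data.
from pathlib import Path

from cryptography.fernet import Fernet  # assumed third-party dependency


def encrypt_to_pendrive(source: Path, pendrive_dir: Path, key: bytes) -> Path:
    """Write an encrypted copy of `source` into `pendrive_dir`."""
    token = Fernet(key).encrypt(source.read_bytes())
    target = pendrive_dir / (source.name + ".enc")
    target.write_bytes(token)
    return target


if __name__ == "__main__":
    # In practice the key would be managed centrally and never stored on the pendrive.
    key = Fernet.generate_key()
    copy = encrypt_to_pendrive(Path("employee_record.pdf"), Path("E:/"), key)
    print(f"Encrypted copy written to {copy}")
```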

  • Children's safety in the face of new technologies

A cell phone, a smartwatch or an interactive toy with Internet access are gifts that are increasingly replacing traditional toys like plush animals, balls or bicycles. On the occasion of Children's Day, it is worth considering whether a particular gadget is appropriate for our child, and what we need to pay attention to so that the use of such devices does not violate privacy or lead to excessive data collection.

  • Meta stops use of user data to train artificial intelligence

Ireland's Data Protection Commission has held discussions with Meta representatives, resulting in the company suspending the use of personal data to train its artificial intelligence-based language models. Meta will not use content shared publicly by adult users of the company's social networks within the European Union and the European Economic Area for this purpose.

According to Meta's new privacy policy, Facebook users could, until June 26 this year, fill out a special form to object to such use of their personal data. Taking the objection into account, however, was subject to certain restrictions, such as the need to provide a justification.

Complaints about the new privacy policy, filed through the organization NOYB (None Of Your Business), also known as the European Center for Digital Rights, have reached 11 data protection authorities, including the President of UODO.

The Irish DPA will continue to hold discussions with Meta on this issue. It will also keep the other DPAs informed of the results of these discussions.

The President of the DPA notes that the processing of personal data using artificial intelligence mechanisms must respect data protection laws, and supervisory authorities will jointly take action and support each other in situations where there is a risk to the privacy of data subjects.

  • The controller has a duty to cooperate with the President of the Office for Personal Data Protection - WSA confirms

The controller is obliged, at the request of the President of the Office for Personal Data Protection (UODO), to provide all information necessary for the performance of the President's tasks, the Provincial Administrative Court in Warsaw (hereinafter: "WSA in Warsaw") has ruled.

Data protection in The Netherlands 

  • Booking.com reports data leaks on time after AP intervention

The Dutch Data Protection Authority (AP) is completing a period of more intensive supervision of Booking.com. In 2023, the AP monitored for a year whether Booking.com properly adhered to the rules regarding reporting data leaks, because the AP had indications that Booking.com previously did not always report data leaks on time. The AP now concludes that Booking.com complied well with the rules regarding reporting data leaks in 2023.

  • AP points out to political parties the risks of personal data surrounding elections

The elections for the European Parliament will take place on June 6, 2024. In a letter, the Dutch Data Protection Authority (AP) draws the attention of the Dutch political parties participating in the elections to the privacy rules.

Political parties are increasingly processing personal data, for example to conduct (targeted) campaigns or to recruit members. They must comply with the General Data Protection Regulation (GDPR). For example, parties collect and process data when they use social media, microtargeting or political advertisements.

  • AP warns: risks of cyber-attacks are often underestimated

Too many organizations in the Netherlands that are hit by a cyber-attack fail to warn people that their data has fallen into the wrong hands. Organizations often estimate the risks of the attack too low – in 7 out of 10 cases. As a result, the people whose personal data has been leaked cannot protect themselves against possible fraud or other crimes by cyber criminals.

The Dutch Data Protection Authority (AP) warns about this in the annual overview of data breach reports in the Netherlands. "Don't underestimate it, with your data in hand, criminals can really do you harm," explains AP chairman Aleid Wolfsen. 'With your telephone number or email address, they can send you payment requests that you may accidentally click on. With a copy of your passport, someone else can take out a loan in your name. Your data is worth gold to criminals.' 

In total, the AP received more than 25,000 reports of a data breach in 2023. All in all, approximately 20 million people became victims.

  • AP identifies privacy risks in the workplace and in social security

The Dutch Data Protection Authority (AP) has identified trends and developments in the field of privacy that play a role in the labor market and social security. The AP sees a number of major risks, including due to algorithms and artificial intelligence (AI).

The AP notes in its Labor and Social Security sector report that:

  • There is increased use and experimentation with algorithms and AI across the board.
  • Employers can keep a closer eye on their employees with algorithms.
  • Employers also monitor staff in other ways, for example through cameras and sensors in the workplace and with alcohol and drug tests.
  • Government agencies are also increasingly using algorithms and AI, for example to detect benefit fraud.

  • AP: Government, do not use Facebook if there is uncertainty about privacy

Government organizations should not use Facebook if it is unclear what happens to the personal data of visitors to their Facebook page. The government must be able to guarantee that the processing of this data complies with the law. The Dutch Data Protection Authority (AP) advises the Ministry of the Interior (BZK) on this.

  • AP: scraping is almost always illegal

Scraping is the automatic collection and storage of information from the Internet. Scraping by private parties and private individuals is almost never allowed. The Dutch Data Protection Authority (AP) states this in a new guide. In practice, these parties can only use scraping legally if they do so in a very targeted manner.

With scraping, a computer program automatically 'scrapes' data from the Internet, for example by scanning social media. Scraping almost always collects personal data, which creates privacy risks. Scraping can collect personal data from many people in a short time, and the information recorded can relate to many aspects of a person's life. It can also include all kinds of special categories of personal data and criminal-offence data, which usually may not be collected and used.
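
Purely as an illustration of how targeted and restrained such collection would have to be, the sketch below shows one minimal technical precaution: checking a site's robots.txt before fetching a specific page. The function and user-agent names are assumptions for the example, and such a check does not by itself make the scraping of personal data lawful under the GDPR or the AP's guide.

```python
# Hypothetical sketch: consult robots.txt before fetching a single, specific page.
# This is only a technical courtesy check, not a legal basis for processing.
from urllib import robotparser
from urllib.parse import urlparse


def may_fetch(url: str, user_agent: str = "example-bot") -> bool:
    """Return True if the site's robots.txt allows this user agent to fetch the URL."""
    parsed = urlparse(url)
    rules = robotparser.RobotFileParser()
    rules.set_url(f"{parsed.scheme}://{parsed.netloc}/robots.txt")
    rules.read()
    return rules.can_fetch(user_agent, url)


if __name__ == "__main__":
    print(may_fetch("https://example.com/some-page"))
```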

  • New rules for credit registration are inadequate

At the insistence of the Dutch Data Protection Authority (AP), the government is introducing new rules for credit registration in the Netherlands. That in itself is a good development. However, the new rules do not yet sufficiently protect the rights of data subjects. This concerns financial information of millions of people. The AP warns about this after assessing the proposal for the Credit Registration System Act. 

This bill clearly needs to be improved on a number of points, the AP concludes. For example, the Credit Registration Office (BKR) must not store people's privacy-sensitive data for too long. BKR is the private organization that manages the credit registration system in the Netherlands.

  • Recruitment company fined for ignoring deletion requests

The Dutch Data Protection Authority (AP) has imposed a fine of 6,000 euros on recruitment company Ambitious People Group (APG). APG is being fined because the company had not deleted the data of 3 different people after they had requested it. 

Job seekers can register with APG if they are interested in mediation by this recruitment agency. People can of course also request that their personal data be deleted, for example if they no longer want mediation. But that did not go well for several people. Names, home addresses, e-mail addresses, telephone numbers, dates of birth and CVs containing information about education and work experience remained in the APG database after the people requested their removal. APG also approached these people about vacancies.

  • AP and RDI: Supervision of AI systems requires cooperation and must be arranged quickly

In the supervision of artificial intelligence (AI), cooperation between supervisors must be paramount, the Dutch Data Protection Authority (AP) and the National Digital Infrastructure Inspectorate (RDI) write in an advice to the cabinet. It must also quickly become clear which authorities will carry out the various parts of the supervision. The first parts of the new European AI regulation will come into effect at the beginning of 2025.  

  • AP: Education may only use social media with clear agreements

Educational institutions may only use social media if they make clear agreements with social media companies about what happens to the data of students and teachers. Is it not possible to make such agreements? Then it is better for an educational institution not to use the social medium in question, according to the Dutch Data Protection Authority (AP).

  • AP: more clarity is needed about approaching people entitled to benefits or allowances

The bill that gives government agencies the opportunity to approach people who are entitled to benefits or allowances still needs changes. This is the opinion of the Dutch Data Protection Authority (AP). For example, people must receive clear information in advance about which personal data authorities will exchange for this purpose.

The AP concludes this after assessing the proposal for the SZW Proactive Services Act, which amends the Work and Income Implementation Organization Structure Act (SUWI). 

Within the existing system for work, income and social security, the UWV, the SVB and municipalities already exchange certain personal data. The intention of the bill is that they will soon also be able to exchange data to determine whether people may miss out on benefits or social services. And if so, to approach those people.

Data protection in Spain

  • Data protection supervisory authorities publish guidelines for processing operations that incorporate Wi-Fi tracking technologies

The Spanish Data Protection Agency, the Catalan Data Protection Authority, the Basque Data Protection Authority and the Transparency and Data Protection Council of Andalusia have prepared guidance on processing operations incorporating Wi-Fi tracking technology, in which they analyze the implications of this technology, identify the main risks and offer a series of recommendations for responsible use compatible with data protection regulations.

Wi-Fi tracking is a technology that allows mobile devices to be identified and tracked through the Wi-Fi signals they emit, detecting the presence of the device in a specific area and identifying movement patterns. Practical applications can be found in shopping centers, museums, work centers, public areas, transportation or large events, being used to estimate capacity, analyze people flows or measure dwell times.

The data protection authorities state that the use of this technology may involve the processing of personal data and, therefore, must be subject to the set of principles, rights and obligations established in the General Data Protection Regulation. Furthermore, its use poses serious privacy risks, as it could allow people's movements to be tracked without them being aware of it and without an appropriate legal basis.

Therefore, the authorities consider that, given the inherent risk factors and elements, in general the conditions are met under which carrying out a Data Protection Impact Assessment (DPIA) before the processing is mandatory. In fact, taking into account the risk factors, they recommend carrying one out even when the controller may not be certain that it is obliged to do so. Furthermore, to use these technologies it is necessary to intensify compliance with the principle of transparency through clear and accessible information, such as visible information panels, public signage, voice alerts or information campaigns, among others.

The guidelines also include a list of measures to implement once all the requirements for compliance with the GDPR principles have been met, highlighting, among others: anonymizing and aggregating data immediately after collection; limiting the area in which Wi-Fi tracking is carried out; not assigning the same identifier to a mobile device on different visits to the same place; implementing security measures adapted to the level of risk and subject to continuous review; and carrying out independent audits (one way the rotating-identifier recommendation could be implemented is sketched below).
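
Purely as an illustration of the recommendation not to assign the same identifier to a device on different visits (this is not part of the authorities' guidance), the sketch below hashes the detected MAC address together with a per-day random salt, so that identifiers cannot be linked across days. All function and variable names are assumptions for the example.

```python
# Hypothetical sketch: derive a non-persistent, per-day pseudonym for a detected
# device, so the same MAC address yields different identifiers on different visits.
import datetime
import hashlib
import hmac
import os

# A fresh secret is generated for each calendar day and discarded afterwards.
_daily_salts: dict[datetime.date, bytes] = {}


def daily_pseudonym(mac_address: str) -> str:
    """Return a short identifier usable only for same-day people-flow counting."""
    today = datetime.date.today()
    salt = _daily_salts.setdefault(today, os.urandom(32))
    digest = hmac.new(salt, mac_address.encode(), hashlib.sha256).hexdigest()
    return digest[:16]


if __name__ == "__main__":
    print(daily_pseudonym("AA:BB:CC:DD:EE:FF"))
```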

  • The AEPD launches a new version of its Gestiona RGPD tool

The Spanish Data Protection Agency (AEPD) has launched a new version of Gestiona RGPD, a tool that helps manage personal data processing, evaluate and manage risks through a catalog of privacy measures and, if necessary, assists in carrying out impact assessments. Gestiona RGPD is aimed at data controllers and processors, as well as data protection officers. The new version expands the catalog of privacy measures applicable to mitigate the risks identified in the processing and includes improvements in the editing of final reports, among other features.

The General Data Protection Regulation (GDPR) establishes that organizations that process personal data must maintain a record of processing activities (RAT) and identify and manage the risks that such processing may pose for the rights and freedoms of the people whose data is being processed. The objective is to select and implement appropriate measures to minimize each risk detected. At the same time, when this analysis reveals that there is a high risk for the protection of individuals, the GDPR establishes that these organizations must carry out a data protection impact assessment (DPIA).

Gestiona RGPD allows you to manage an organization's record of processing activities, with up to 500 processing operations handled in an integrated manner, as well as those of different entities. It includes functions to identify risk factors for people's rights and freedoms and to make a first assessment of intrinsic risk. These functions allow you to manage risk with privacy measures that the tool itself suggests for each identified risk factor, as well as measures for managing personal data breaches and security, and organizational measures and data protection policies.

The new version of Gestiona goes from more than 500 measures to almost 800, classified according to the risk factors previously identified by the entity, and also covers aspects such as governance, security and data breaches. In this way, the selection of risk factors and measures to mitigate them constitutes a broad starting point for the risk identification and management processes needed to comply with the risk-based approach of the GDPR. Likewise, Gestiona is a useful tool for organizations that need to start carrying out data protection impact assessments when the risk analysis indicates that the processing may pose a high risk to the rights and freedoms of the people whose data is processed.

Management of the processing is carried out on the user's device through their browser, without installing any application; the information is stored locally, allowing the data of different controllers to be managed without transmitting information to the Agency or to third parties, guaranteeing confidentiality. The information can be stored in a file on the user's computer and retrieved after each session, allowing for different versions. The new version includes improvements in the editing of the reports that the tool produces when the process is completed. In addition, the Agency has analyzed the queries that data controllers have submitted to it and has included answers to all of them in a new user guide, which covers issues such as the scope of the tool, the storage and retention of the processed information, and measures to mitigate the identified risk factors.

Data protection in Mexico

  • Public and private institutions must strengthen security infrastructure to avoid personal data leaks

Public institutions and the private sector must reinforce their security infrastructure to avoid leaks of personal data that affect people's privacy, said the President Commissioner of the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI), Adrián Alcalá Méndez, while participating in the forum The virtual is real, organised by the Senate of the Republic.

  • INAI promotes rights of access to information and protection of personal data among young university students

With the aim of promoting the rights of access to information and protection of personal data among the country's youth, the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) and the Chihuahua Institute for Transparency and Access to Public Information (Ichitaip) held the National Platform for Transparency, Data Protection and Artificial Intelligence conversation at the Autonomous University of Ciudad Juárez. Commissioner Norma Julieta del Río Venegas emphasised that INAI promotes the rights of access to information and protection of personal data through various means, including the Centro de Atención a la Sociedad (CAS), the virtual assistant CAVINAI and the APP of the Plataforma Nacional de Transparencia (PNT), technological innovations that allow the exercise of rights from anywhere in the country and the world, immediately.

  • Archives are essential for transparency, accountability and the fight against corruption.

The National Institute for Transparency, Access to Information and Protection of Personal Data (INAI), the General Archive of the Nation (AGN) and the National Transparency System (SNT) promote the professionalisation of the country's public servants in the field of document management, as a way to guarantee the right to know and privacy, as well as accountability and the fight against corruption at local and national level. In her welcome message to the fourth day of the Caravana Archivística, which took place in the state of Guerrero, with the theme Catálogo de Gestión Documental, INAI's Commissioner, Josefina Román Vergara, stressed that the topic addressed is essential to guarantee the right of access to public information, as it is related to the fulfilment of common transparency obligations.

  • INAI promotes the right to protection of personal data in the state of Tamaulipas with the Privacy Route

With the purpose of promoting the right to the protection of personal data in the state of Tamaulipas, the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) carried out the Privacy Route in the state, with the theme of neuro-rights, in coordination with the Institute for Transparency, Access to Information and Protection of Personal Data of the State of Tamaulipas (ITAIT). In her message, Josefina Román Vergara, INAI Commissioner and promoter of the Privacy Route, pointed out that the concept of neuro-rights has lately generated debate at the intersection of technological innovation, ethics and justice in the digital ocean; as brain-computer interfaces advance, it is imperative to establish a legal and ethical framework that guarantees that thoughts and emotions are not exploited, manipulated or violated without consent.

  • National Transparency Platform, essential to guarantee access to information and protection of personal data in Mexico

The National Transparency Platform (PNT) has established itself as an indispensable tool to guarantee the rights of access to information and protection of personal data in Mexico, as evidenced by the 9,577,189 requests for information submitted from its launch in 2016 until 6 May, said the Plenary of the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI). At the commemoration of the 8th anniversary of the PNT, INAI Commissioner and Coordinator of the Permanent Commission on Information Technologies, Norma Julieta del Río Venegas, stressed that the population has found in the PNT a way to exercise their rights, as shown by the 262,769 review appeals filed since its creation, as well as the 17 million queries in the Thematic Search Engines and the 30 million downloads of open data.

  • INAI issues 10 recommendations to protect your privacy on World Internet Day

This World Internet Day, the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) issues 10 recommendations to reduce the risks that may arise when surfing the net, such as privacy violations, cyberbullying and fraud.

In order to reduce the risks when surfing the Internet, INAI makes the following recommendations:

1. Surf safely. Access pages of recognised institutions or companies and always check that the toolbar displays a green padlock that certifies the authenticity of the page.

2. Choose strong passwords. Combine uppercase and lowercase letters, symbols and numbers. It is best not to use the same password for all accounts, nor to include dates of birth, anniversaries or telephone numbers (a minimal example of generating such a password is sketched after this list).

3. Set up your privacy settings on social networks. Social networks have options through which you can choose who sees the content you publish. It is advisable to check these sections and make sure they have the level of privacy you have chosen.

4. Protect access to devices and accounts. Mobile devices offer options to set passwords, it is advisable to activate them to increase the level of security.

5. Download software and applications from official sites. This way you can verify that they meet the required security conditions.

6. Be wary of strangers. Never open files of dubious origin or accept unknown people on social networks.

7. Keep your software and applications up to date. Check that your operating system has the latest updates and security patches.

8. Encrypt information. There are several programs that allow you to encrypt information to prevent others from accessing the data.

9. Take care of your physical environment. Avoid accessing your bank accounts or making purchases on public computers.

10. Close all sessions. Delete your browsing history when using shared computers, as well as downloaded files to prevent others from viewing them.
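
As a small illustration of recommendation 2 (this is not an INAI tool), the sketch below generates a random password that mixes uppercase and lowercase letters, digits and symbols using Python's standard secrets module; the length and symbol set are assumptions for the example.

```python
# Hypothetical sketch: generate a strong random password mixing uppercase and
# lowercase letters, digits and symbols, as suggested in recommendation 2.
import secrets
import string

SYMBOLS = "!@#$%^&*()-_=+"


def generate_password(length: int = 16) -> str:
    """Return a random password containing at least one character of each class."""
    alphabet = string.ascii_letters + string.digits + SYMBOLS
    while True:
        candidate = "".join(secrets.choice(alphabet) for _ in range(length))
        if (any(c.islower() for c in candidate)
                and any(c.isupper() for c in candidate)
                and any(c.isdigit() for c in candidate)
                and any(c in SYMBOLS for c in candidate)):
            return candidate


if __name__ == "__main__":
    print(generate_password())
```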

  • Artificial intelligence must respect privacy

It is necessary to design artificial intelligence tools with rules that guarantee privacy and the protection of personal data, said Adrián Alcalá Méndez, President Commissioner of the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI), while participating in the conference "Harmonising artificial intelligence and the protection of personal data in modern document management". "Artificial intelligence makes decisions and executes tasks based on the knowledge obtained. It is necessary to feed artificial intelligence with rules that the algorithm must obey at any moment of the process, in order to guarantee the correct treatment of personal data," said Alcalá Méndez. The activities organised by the State Institute of Transparency, Access to Information and Protection of Personal Data (InfoNL) were also attended by INAI commissioners Josefina Román Vergara and Blanca Lilia Ibarra Cadena.

  • Guide to configure the privacy of your TikTok account

TikTok is a popular content application that allows users to create and share short videos. It is important to protect your privacy and your personal information when using TikTok. The National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) suggests some tips on how to protect your privacy and personal information when using TikTok.

1. Control who can see your videos. You can manage the viewing controls for your account and your videos in the "Privacy" section of your profile settings. There you can make your account private, so that only people you approve can follow you and view your videos, and you can choose who can comment on your posts: all TikTok users or a more restricted group.

2. Protect your personal information. Be careful about the personal information you share on your profile. Avoid posting your address or phone number, and make sure your username does not reveal personal information. Never share your password.

3. Be careful what you post. When you post content on TikTok, make sure it is appropriate and does not reveal personal information. You should also be careful about what you post about other people.

4. Set up two-step verification. TikTok offers a two-step verification feature to increase the security of your account. Set it up to add an extra layer of protection.

5. Review the settings of third-party apps. If you use third-party apps on TikTok, be sure to review the privacy settings. Some of them may request access to your personal information, which can be dangerous.

6. Log out of your session. Always log out of TikTok when you are done using the app. This will help protect your account from people who may have access to your mobile phone.

Data protection in India

  • Data protection, AI rules top tech’s Wishlist for next govt

India’s first data protection law has been in suspended animation for the last nine months for the lack of subordinate rules to enforce the legislation, according to technology policy lawyers, think tanks and policy groups.

The other key legislation that requires urgent attention is the Digital India Bill (DIB), which would replace the Information Technology (IT) Act of 2000 and revisit fundamental concepts like intermediary liability, fake news and deep fake content.

Regulating artificial intelligence (AI) is also needed, especially after the EU wrapped up approval of the world’s first law governing AI development. The DIB is likely to regulate AI among other things.

Draft DIB - the DIB is aimed at bringing together the rules, guidelines and laws governing internet intermediaries, and as such is an important initiative in terms of re-evaluating platform immunity, ease of doing business and protection from harm. The new government should initiate consultations on the DIB, since the IT Act needs an urgent overhaul to keep up with the times.

Regulating AI - The government would need to explore the impact of AI on the domestic economy, jobs, intellectual property rights, data protection, liability of AI agents, and social harms in the form of misinformation and synthetic content.

On AI regulation, the issues cross over between consumer protection and intellectual property and require careful deliberation. As in many other regulatory initiatives, the European Union has taken the lead, and that will help in building an empirical understanding of potential solutions to the inevitable rise of artificial intelligence.

Data protection in Brazil

  • ANPD establishes Process Governance Methodology

The National Data Protection Authority (ANPD) published Resolution CD/ANPD nº 14/2024, which establishes the Process Governance Methodology within the scope of the Authority. The instrument is provided for in the ANPD's Process Governance Policy and is essential for its full implementation. 

The Methodology directs process governance in ANPD units in a coordinated and consistent manner, aiming to improve processes and support the institution's strategic planning. The initiative is related to the public governance guidelines established in Decree No. 9,203 of November 22, 2017, which provides for the governance policy of the direct, autarchic and foundational federal public administration.

  • ANPD establishes integrity program and creates management committee

The National Data Protection Authority (ANPD) published Resolution CD/ANPD nº 12, of April 9, 2024, which establishes its Integrity Program. The objective is to promote compliant conduct, transparency, the prioritization of the public interest and an organizational culture focused on delivering value to society.

According to the document, the initiative will be operationalized through an Integrity Plan, which will define the measures to be adopted based on an assessment of integrity risks. Some of the aspects to be observed are the existence of possible conflicts of interest, the prevention of moral and sexual harassment, the strengthening of transparency measures, and the establishment of ways of following up on and monitoring the Plan.

Resolution CD/ANPD No. 13 was also published, establishing the Commission for Integrity, Transparency and Access to Information. The collegiate body is permanent and has the objective of coordinating, supervising, monitoring and evaluating matters related to integrity, transparency and access to information within the Authority, acting as a sectoral unit of the Integrity, Transparency and Access to Information System (Sitai), established by Decree No. 11,529 of May 16, 2023. This Commission will also be responsible for preparing, following up on and monitoring the Integrity Plan that will operationalize the ANPD Integrity Program.

  • Director of ANPD defends the Authority's leading role in regulating AI

The Director of the National Data Protection Authority (ANPD), Miriam Wimmer, participated in the 1st International Seminar on Artificial Intelligence and Law. The event, which took place from the 11th to the 13th, was promoted by the Department of Law of the Pontifical Catholic University of Rio de Janeiro (PUC-Rio) with support from the Support Program for Events in the Country (Paep) of the Coordination for the Improvement of Higher Education Personnel (Capes).

The official addressed the topic of institutional arrangements for the regulation of artificial intelligence (AI). According to her, it is important for the country to advance in the development of policies, regulations and governance systems that ensure ethics, privacy, transparency, security and responsibility. These aspects are today among the main challenges faced by Brazil in terms of regulation and have been a subject of concern for all sectors interested in discussing the regulation of AI.

  • ANPD approves the Security Incident Communication Regulation

The National Data Protection Authority (ANPD) published Resolution No. 15/2024, which approved the Security Incident Communication Regulation (RCIS). The regulation has the objectives of mitigating or reversing losses generated by incidents; ensuring responsibility and accountability; promoting the adoption of good governance, prevention and security practices; and strengthening the culture of personal data protection in the country.

The RCIS provides that the controller must inform the ANPD and the data subject about security incidents that may cause relevant risk or damage. The obligation is directly related to the possible harm to the interests and fundamental rights of data subjects and to the involvement of sensitive personal data, data of minors, financial data, system authentication data, data protected by secrecy, or data processed on a large scale.

The regulation also sets out the deadlines for the controller to carry out the communication and what information must be forwarded. The regulations also make it mandatory to keep a record of security incidents involving personal data for at least five years.

The approval and publication of the Security Incident Communication Regulation strengthens the protection of data subjects' rights, as it is a catalyst for implementing the general protection principles established in the LGPD, in particular the principle of transparency, given the need to provide clear information to data subjects whose personal data were involved in an incident.

  • ANPD presents proposals to amend the substitute for PL 2338, on artificial intelligence

The National Data Protection Authority (ANPD) delivered to Senator Eduardo Gomes (PL-TO), president of the Temporary Commission on Artificial Intelligence in Brazil, a document containing the Authority's contributions to the substitute text for Bill (PL) 2338. The project, presented last year by the president of the Senate, Rodrigo Pacheco (PSD-MG), is under analysis along with nine other proposals, including one already voted on in the Chamber of Deputies.

The changes proposed by the Authority include changes to the definitions of some terms, changes to the approach to data subject rights, issues related to biometric systems, classification of high-risk systems, and regulation and governance of artificial intelligence. The 25-page document also suggests changes to the regulation and standardization process, administrative sanctions and rules for executing a regulatory sandbox, in addition to the deadline for appointing the competent authority.

The proposals, prepared by a multidisciplinary team of civil servants, reflect the understanding that artificial intelligence requires multisectoral governance. According to this point of view, there would be a need for central coordination that harmonizes guidelines and regulatory actions at a national level. 

“The ANPD stands out as the most suitable entity to lead this process, given its mandate to ensure the constitutional right to the protection of personal data and its technical-regulatory expertise,” argues Waldemar Gonçalves, Director-President of the ANPD.

  • ANPD updates Personal Data Protection and Privacy Glossary

The National Data Protection Authority (ANPD) published an updated version of the Data Protection and Privacy Glossary. In addition to containing new entries, the new edition is available in full on the Authority's website, facilitating access to the content.

The glossary aims to consolidate, in a single instrument, concepts previously dispersed across a variety of normative acts and guidelines issued by the ANPD. In this way, access to and understanding of legal and technical terms essential for the interpretation and application of the General Data Protection Law (LGPD) will be simplified.

Prepared by the General Coordination of Standardization (CGN), the document is dynamic and will be updated as the sector evolves. It is also a measure that aims to disseminate knowledge about the protection of personal data and to standardize procedures that facilitate data subjects' control over their personal data.

  • Article in the GPA Newsletter defends the choice of ANPD as the central regulatory agency

The Director-President of the National Data Protection Authority (ANPD), Waldemar Gonçalves, and the Project Manager of the Autarchy's Board of Directors, Jeferson Barbosa, sign an article published in the May edition of the GPA Newsletter, in which they defend the choice of the Autarchy as the institution competent to regulate artificial intelligence (AI).

The article provides a brief account of the impact of the new technology and reviews the history of Bill 2338, presented in 2023, for which the ANPD proposes a governance model based on four aspects: an advisory council with participation from society; the Federal Executive Branch, responsible for formulating and implementing public policies; the ANPD, in the role of central authority of the regulatory system; and the creation of a regulators' forum, composed of the Authority and other regulators.

The authors emphasize, however, that the ANPD's concern with AI predates the PL. In 2022, the topic was included as a priority in its Regulatory Agenda for the 2023-2024 biennium. The Sandbox project, put out for public consultation in October 2023, also deserves attention. “The initiative is important to support future AI regulation and stimulate responsible innovation,” say the authors. The project was also mentioned in the World Economic Forum's Artificial Intelligence Governance Report.

The article concludes that the ANPD, due to its technical capacity and knowledge, is the ideal body to play the leading role in the regulation of AI in Brazil.

The GPA Newsletter is a publication of the Global Privacy Assembly, one of the most important global forums on privacy and data protection.  

  • Biometrics is the theme of the second volume of the Technological Radar series

The National Data Protection Authority (ANPD) launched the second volume of the Technological Radar series, on Biometrics and Facial Recognition. The first edition, published in January, addressed the topic of Smart Cities.

Conducted by the Authority's General Coordination of Technology and Research (CGTP), the study highlights the growing relevance and popularity of biometric and facial recognition technologies, which have evolved significantly in recent years. It addresses the applications and impacts of these technologies in various sectors, such as education and public security, in addition to discussing the risks and challenges associated with privacy and the protection of personal data.

The research showed that, despite promising advances, it is necessary to take into account that biometric information is sensitive personal data and that possible errors can cause considerable harm to a wrongly identified person. Furthermore, these technologies may reflect discriminatory biases against certain social groups.

The topic is on the ANPD's Regulatory Agenda for the 2023–2024 biennium, as it raises significant concerns about privacy and the protection of data subjects' personal data – especially in relation to the growing popularity of facial recognition in contexts such as access to electronic devices, entry into public spaces and the exercise of rights.

If you have any questions, please send us an email to datasecurity@catts.eu

CATTS Support

How can we help?

CATTS is your dedicated partner for comprehensive data protection and compliance solutions. From strategic guidance and customized training to data security assessments and regulatory monitoring, we empower businesses for ethical success in the digital age. Whether it's GDPR compliance, Privacy Impact Assessments, or incident response, CATTS ensures strategies tailored to your unique data protection needs.

Contact Us