Regulators have entered 2025 with a sharper focus on enforcement, cooperation, and guidance. In this edition of Global Data Protection Compliance FY25/01, we explore how data protection authorities across the EU, Brazil, India, Mexico, Poland, the Netherlands, and Spain are responding to the growing impact of AI technologies, cross-border data challenges, and the need for stronger coordination between regulatory bodies. The updates reflect not just legal shifts, but a deeper push to clarify standards and close compliance gaps in both the public and private sectors.
- Data protection in the EU
- Data protection in Poland
- Data protection in The Netherlands
- Data protection in Spain
- Data protection in Mexico
- Data protection in India
- Data protection in Brazil

Data protection in the EU

- EDPB adopts pseudonymization guidelines and paves the way to improve cooperation with competition authorities
During its January 2025 plenary meeting, the European Data Protection Board (EDPB) adopted guidelines on pseudonymization, as well as a statement on the interplay between competition law and data protection.
The GDPR introduces the term ‘pseudonymization’ and refers to it as a safeguard that may be appropriate and effective to meet data protection obligations. In its guidelines, the EDPB clarifies the definition and applicability of pseudonymization and pseudonymized data, and the advantages of pseudonymization.
The guidelines provide two important legal clarifications:
1. Pseudonymized data, which could be attributed to an individual by the use of additional information, remains information related to an identifiable natural person and is therefore still personal data. Indeed, if the data can be linked back to an individual by the data controller or someone else, it remains personal data.
2. Pseudonymization can reduce risks and make it easier to rely on legitimate interests as a legal basis (Art. 6(1)(f) GDPR), as long as all other GDPR requirements are met. Likewise, pseudonymization can help establish compatibility with the original purpose (Art. 6(4) GDPR).
The guidelines also explain how pseudonymization can help organizations meet their obligations relating to the implementation of data protection principles (Art. 5 GDPR), data protection by design and default (Art. 25 GDPR) and security (Art. 32 GDPR).
Finally, the guidelines analyze technical measures and safeguards, when using pseudonymization, to ensure confidentiality and prevent unauthorized identification of individuals.
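One common technical measure in this space (illustrative only; the EDPB guidelines discuss a range of techniques and do not mandate any particular one) is keyed hashing, where identifiers are replaced with pseudonyms that can only be re-linked by whoever holds a separately stored secret key. A minimal sketch:

```python
import hmac
import hashlib

def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from an identifier using HMAC-SHA256.

    The secret key plays the role of the 'additional information' the
    guidelines refer to: whoever holds it can re-derive and thus link
    pseudonyms, so it must be stored separately under strict access
    controls. The data remains personal data either way; pseudonymization
    is a safeguard, not anonymization.
    """
    return hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same identifier always maps to the same pseudonym under one key,
# preserving linkability within a dataset for analysis...
key = b"store-me-in-a-separate-key-management-system"
p1 = pseudonymize("jan.kowalski@example.com", key)
p2 = pseudonymize("jan.kowalski@example.com", key)
assert p1 == p2

# ...while a different key yields unlinkable pseudonyms, limiting the
# damage if one pseudonymized dataset is disclosed.
p3 = pseudonymize("jan.kowalski@example.com", b"another-key")
assert p1 != p3
```

Using a keyed construction rather than a plain hash matters here: an unkeyed hash of a low-entropy identifier (an email address, a national ID) can often be reversed by brute force, which would defeat the confidentiality objective the guidelines describe.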
The guidelines will be subject to public consultation until 28 February 2025, providing stakeholders with the opportunity to comment and allowing for the incorporation of future developments in case law.
- Interplay between data protection law and competition law: the EDPB’s take on how to improve cooperation between regulators
During the plenary meeting, the EDPB also adopted a position paper on the interplay between data protection law and competition law.
The CJEU Meta vs. Bundeskartellamt ruling of 4 July 2023 clearly indicated that data protection and competition authorities are required to work together, in some cases, to achieve effective and coordinated enforcement of data protection and competition law. While these are separate areas of law pursuing different goals in different frameworks, they may in some cases apply to the same entities. It is therefore important to assess situations where the laws may intersect.
In this position paper, the EDPB explains how data protection and competition law interact. It suggests steps for incorporating market and competition factors into data protection practices and for data protection rules to be considered in competition assessments. It also provides recommendations for improving cooperation between regulators; for example, authorities should consider creating a single point of contact to manage coordination with other regulators.
- CEF 2024: EDPB identifies challenges to the full implementation of the right of access
The European Data Protection Board (EDPB) has adopted a report on the implementation of the right of access by controllers. The report summarizes the outcome of a series of coordinated national actions carried out in 2024 under the Coordinated Enforcement Framework (CEF). It lists the issues that were observed for some controllers, along with a series of recommendations to help them implement the right of access. A central element is controllers’ awareness of the EDPB Guidelines 01/2022 on data subject rights – Right of access and whether these guidelines were followed in practice.
EDPB Deputy Chair Zdravko Vukić said: “The CEF is a valuable initiative that helps strengthen the cooperation among Data Protection Authorities (DPAs): by tackling selected topics in a coordinated fashion, they achieve greater efficiency and more consistency. How controllers implement the right of access lies at the heart of data protection and it is one of the most frequently exercised data subject rights.”
Throughout 2024, 30 DPAs across Europe launched coordinated investigations into the compliance of controllers with the right of access, by opening formal investigations, assessing whether a formal investigation was warranted and/or carrying out fact-finding exercises. A total of 1,185 controllers, consisting of small and medium-sized enterprises (SMEs) and big companies active in different industries and fields, as well as various types of public entities, responded to the action.
Areas of improvement and main challenges:
The results suggest that more awareness raising about Guidelines 01/2022 is necessary, both at national and EU level, as the guidelines help controllers implement the right of access, explain how exercising this right can be made easier, and list the exceptions and limitations of the right of access.
As a result of the 2024 CEF action, seven challenges were identified. One of them is the lack of documented internal procedures to handle access requests. In addition, inconsistent and excessive interpretations of the limits to the right of access were also observed, such as overly relying on certain exceptions to automatically refuse access requests. Another example is the barriers that individuals could encounter when exercising their right of access, such as formal requirements or being requested to provide excessive identification documents. For each challenge identified, the report provides a list of non-binding recommendations to be taken into account by controllers and DPAs.
Positive findings
Despite the existing challenges, two thirds of participating DPAs evaluated the level of compliance of responding controllers with respect to the right of access from ‘average’ to ‘high’. One important factor identified as having an impact on the level of compliance was the volume of access requests received by controllers, as well as the size of the organization. More specifically, large controllers or controllers receiving more requests were more likely to reach a higher level of compliance than small organizations with fewer resources.
Positive findings were observed across Europe. These include the implementation of best practices by controllers, such as user-friendly online forms enabling individuals to submit an access request easily as well as self-service systems to allow individuals to autonomously download their personal data in a few clicks and at any time.
- Stay in control of your personal data. Happy Data Protection day 2025!
If someone asked you to answer 100 questions about your personal life to sell the answers, would you agree? Most likely not.
It can be difficult to stay in control of your personal data and to keep it safe. From online shopping and browsing to social media, with every click, share and log-in you leave behind a digital trail. The GDPR ensures that your data can only be used in ways you agree to and that you can access any information about yourself.
But do people actually know how to protect their data? We asked passers-by on the streets of Brussels: Link
- EDPB adopts statement on age assurance, creates a task force on AI enforcement and gives recommendations to WADA
During its February 2025 plenary meeting, the European Data Protection Board (EDPB) adopted a statement on age assurance and decided to create a task force on AI enforcement. In addition, the Board also adopted recommendations on the 2027 World Anti-Doping Agency (WADA) World Anti-Doping Code.
In a statement on age assurance, the EDPB lists ten principles for the compliant processing of personal data when determining the age or age range of an individual. The statement aims to ensure a consistent European approach to age assurance, to protect minors while complying with data protection principles.
The EDPB is also cooperating with the European Commission on age verification in the context of the Digital Services Act (DSA) working group.
During the plenary, the Board also decided to extend the scope of the ChatGPT task force to AI enforcement. In addition, the EDPB members underlined the need to coordinate DPAs' actions regarding urgent sensitive matters and for that purpose will set up a quick response team.
During the plenary, the EDPB also adopted recommendations on the 2027 WADA World Anti-Doping Code. When processing personal data for anti-doping purposes, it is essential to respect and safeguard the personal data of athletes. In many cases, this will involve the processing of sensitive personal data, such as health data derived from biological samples.
The EDPB’s main objective is to assess the compatibility of the WADA Anti-Doping Code and International Standard for Data Protection (ISDP) with the GDPR. The Anti-Doping Code and Standards should hold the National Anti-Doping Organizations (NADOs) to a standard equivalent to that of the GDPR when processing personal data for anti-doping purposes.
The EDPB’s recommendations address key principles of data protection, such as the need for an appropriate legal basis for the processing of personal data and purpose limitation. The recommendations also address the fact that individuals need to be fully informed about the processing of their personal data and can effectively exercise their rights.
- EDPB adopts statement on the implementation of the PNR directive
During its March 2025 plenary meeting, the European Data Protection Board (EDPB) adopted a statement on the implementation of the Passenger Name Record Directive (PNR) in light of the Court of Justice of the EU (CJEU) judgment C-817/19.
In its second statement on the implementation of the PNR Directive, which follows the one of 15 December 2022, the Board gives further guidance to the Passenger Information Units (PIUs) on the necessary adaptations and limitations to the processing of PNR data, following the PNR judgment. PNR data is personal information provided by passengers, and collected and held by air carriers, that includes the names of the passengers, travel dates, itineraries, seats, baggage, contact details and means of payment.
The statement includes practical recommendations for the national laws transposing the PNR Directive in order to give effect to the findings of the CJEU in the PNR judgment. The recommendations cover some of the key aspects of the PNR judgment, such as how European countries should select the flights from which PNR data is collected, or how long PNR data should be retained. According to the Board, the retention period of all PNR data should not exceed an initial period of six months. After this period, European countries may only store PNR data as long as needed and proportionate to the objectives of the PNR Directive.
The Board is aware that some European countries have already started the adaptation process, but there is still a substantial lack of implementation efforts throughout the Member States. Therefore, in its statement, the EDPB stresses the urgent need to implement the necessary changes and to amend national laws by taking into account the PNR judgment as soon as possible.
Data protection in Poland

- Polish SA: administrative fine of 19 800 € for failure to notify a personal data breach to the supervisory authority
The issue concerns a press conference at which the Prosecutor of the National Public Prosecutor's Office and the Minister of Justice - Prosecutor General discussed the case of one of the District Prosecutor's Offices. During the conference, they disclosed personal data of a person having the status of a victim in criminal proceedings, as well as information on the facts of the case contained in the judgment of the district court. The data disclosed included not only the person's name and surname but also information constituting special categories of data. Although a personal data breach thereby occurred, the controller neither reported it to the President of the Personal Data Protection Office nor notified the natural person concerned.
In its correspondence with the President of the Personal Data Protection Office, the National Public Prosecutor's Office presented the position that the disclosure of the data was not a personal data breach, as the information was processed by the National Public Prosecutor's Office in connection with the performance of the statutory tasks of the National Public Prosecutor's Office, and this remains outside the competence of the President of the Personal Data Protection Office.
The President of the Personal Data Protection Office, Mirosław Wróblewski, imposed a fine of 19 800 € for infringement of Articles 6, 33 and 34 of the GDPR on the National Public Prosecutor's Office in connection with the breaches found. In addition, he ordered the National Public Prosecutor's Office to notify the victim, in accordance with the GDPR, of the possible consequences of the breach and of the measures, applied or proposed by the controller, to minimize the effects of the breach.
- The Personal Data Protection Office recommends caution when using DeepSeek
The Personal Data Protection Office is analyzing the information related to the DeepSeek R1 model and examining whether the service offered by the Chinese provider meets the data processing requirements applicable in the EU. For the Personal Data Protection Office, it is important whether the application of the Chinese companies providing the DeepSeek chatbot operates in accordance with the principles of data processing, whether the providers are mindful of the scope and purposes of processing, and whether they comply with the information obligation towards their users.
DeepSeek is a chatbot based on generative artificial intelligence technology, which in January 2025 was introduced to the global market, among others as a free application. DeepSeek's core component is software designed to understand and process human conversations. The service is provided by Chinese-registered companies Hangzhou DeepSeek Artificial Intelligence Co., Ltd. and Beijing DeepSeek Artificial Intelligence Co. In just the first two weeks, 3.6 million people worldwide downloaded the software.
Taking into account the preliminary findings related to the information shared by the provider in its privacy policy, the President of the Personal Data Protection Office recommends extreme caution in the use of applications and other services offered within DeepSeek. The information contained therein shows, inter alia, that user data may be stored on servers located in China. It should be recalled that China is not one of the countries for which an adequacy decision has been issued by the European Commission and, under the current legislation, the Chinese government has extensive powers regarding access to personal data without the guarantee of protection provided under the European data protection laws.
The Personal Data Protection Office also recalls that technologies based on generative artificial intelligence are designed to process a huge amount of data that can be used for purposes that are incompatible with the user's original wish, e.g. to further train the model or for marketing purposes.
The Personal Data Protection Office is in contact with the other supervisory authorities - members of the European Data Protection Board - to research DeepSeek's activities in the EU and their impact on the protection of individuals with regard to the processing of their data. It also exchanges information on national activities with those authorities.
- Draft Cyber Security Strategy - Personal Data Protection Office comments
The President of UODO (the Personal Data Protection Office) has provided the Minister of Digitalization with comments on the draft Cyber Security Strategy of the Republic of Poland for 2025-2029. This document will also have an impact on the sphere of human rights and freedoms. It is therefore important that the legal solutions in the field of cyber security are of the highest standard and respond to the challenges ahead.
The President of UODO views the presentation of the draft Strategy positively, as well as the idea of strengthening the resilience of cyberspace by raising the level of information protection in the public, military and private sectors. It is also important that the document focuses on promoting knowledge and good practices in protecting our data and information.
Data protection in The Netherlands

- AP: AI chatbot apps for friendship and mental health lack nuance and can be harmful
Most AI chatbot apps for virtual friendship and mental health therapy give unreliable information and are sometimes even harmful. These AI chatbots contain addictive elements, pose as real people, and may even be dangerous for vulnerable users in a crisis situation. This is the result of a study conducted by the Dutch Data Protection Authority (AP) into 9 popular chatbot apps.
Worldwide, the use of AI chatbot apps for virtual friendships (also known as ‘companion apps’) and therapeutic purposes is growing. In the Netherlands, they rank high on the list of most downloaded apps. The AP examined the AI chatbot apps for the fourth AI & Algorithmic Risk Report Netherlands (ARR), in which signals, developments, policies and regulations concerning AI and algorithms are analyzed and explained by the AP.
Many of these AI-driven chatbots are unable to perceive nuances in conversations. The chatbots that have been tested are based on English language models and provide worse answers when chatting in Dutch. The quality of answers during conversations in English was also unreliable.
Crisis moments
The AI chatbots can produce less nuanced, inappropriate and sometimes even harmful responses to users who bring up mental problems. During crisis moments, the chatbots rarely, if ever, refer users to resources for professional care or assistance.
Not transparent
Because of the design of these types of AI chatbot apps, users may forget that they are not talking to a human being. When asked “Are you an AI chatbot?”, most chatbots respond evasively or sometimes even deny that they are an AI chatbot.
Aleid Wolfsen, Chair of the AP, says: “These chatbots should make it clear to users that they are not talking to a real person. People should know who or what they are dealing with, they are entitled to that. Privacy legislation requires apps to be transparent about what happens with the sensitive personal data that users share in the chat. And soon, the AI Act will also require chatbots to be transparent to users about the fact that they are interacting with AI.”
‘Are you still there...?’
In these AI chatbot apps, addictive elements are often deliberately built in so that users, for example, stay in chat sessions longer or purchase extras. The chatbots may end their responses with a question to the user, or pulsating dots appear on screen, making it seem as if the chatbot is writing an answer.
Low-threshold
These companion apps and therapeutic apps are offered in various app stores as virtual friends, therapists or life coaches. They are popular, for example, because the threshold to start interacting with the chatbot is low, or because professional therapy is not (yet) available.
Some apps offer characters that you can chat with. For example, you can choose from a virtual dream partner, a character from a movie, or sometimes even a character posing as a psychologist.
Hyper-realistic
AI technology makes it increasingly difficult for users to distinguish the AI-generated conversations from real ones. Companion apps increasingly offer voice options, so that a user can also ‘call’ a chatbot in voice mode. The app's appearance then takes on the shape of a phone call screen, making it look to the user as if they are actually making a call. The AI chatbot also sounds like a real person.
Wolfsen: “Technological progress is expected to make this kind of application even more realistic. We are deeply concerned about these and future hyper-realistic applications. That is why we are committed to raising awareness and promoting responsible use of AI.”
Offers
The providers of many of these apps are commercial companies. These parties have a profit motive and gain access to a lot of personal information through these conversations. In some cases, users are offered subscriptions, products or extras, such as virtual outfits for their characters or access to other chat rooms. During conversations about mental health problems, users may also be confronted with paywalls. In that case, they have to pay or subscribe to continue the conversation.
AI Act
Since February 2025, the AI Act prohibits certain categories of manipulative and deceptive AI. These prohibitions should prevent AI systems, including chatbots, from causing significant harm to people. Developers of AI systems are obliged to assess the risks and build in safeguards to prevent prohibited use. The European Commission has recently published guidelines on the prohibitions in the AI Act.
In most European Member States, including the Netherlands, the supervisory authorities for prohibited practices still have to be designated. The final opinion on the design of supervision of the AI Act contains a recommendation to assign supervision of prohibited AI to the AP.
- Consultation meaningful human intervention in algorithmic decision-making
The Dutch Data Protection Authority (AP) is developing a practical tool for meaningful human intervention in algorithmic decision-making. To ensure this document aligns with real-world practices, the AP invites companies, organizations, experts, and stakeholders to provide input through a consultation.
Organizations are increasingly using algorithms and artificial intelligence (AI) for algorithmic decision-making. For example, for assessing credit applications or reviewing online job applications. If organizations want to make use of algorithmic decision-making, they must comply with regulations. Individuals have the right to human intervention in algorithmic decisions that affect them. The GDPR sets rules for this.
Meaningful intervention
Human intervention should ensure that decisions are made carefully and prevent people from being unintentionally excluded or discriminated against by an algorithm. This intervention should not be merely symbolic; it must meaningfully contribute to the decision-making process.
Proper implementation is key
The way meaningful human intervention is structured is crucial. For example, if an employee is hindered by time constraints or an unclear system, this may affect the decision’s outcome.
Organizations often have questions about how to implement this effectively. Therefore, the AP is developing a tool for meaningful human intervention in algorithmic decision-making. To ensure these guidelines align with real-world practices, the AP is seeking input from businesses and organizations through a consultation.
The AP provides examples and an overview of questions that can help organizations when implementing meaningful human intervention. These questions and examples address the relevant factors: humans, technology, design, and processes.
Your input is welcome
The AP invites organizations, experts, and relevant stakeholders to participate in the consultation. We are interested in real-world experiences. Have you found an approach that works, or are you facing challenges?
Your input will help refine the document. A summary of the responses will be published, but names, organizations, and contact details will not be disclosed.
Would you like to contribute? Please send an email to ppa@autoriteitpersoonsgegevens.nl. You can also reach out with questions about the document or the consultation process. We welcome your input by April 6, 2025.
Data protection in Spain

- Lorenzo Cotino Hueso and Francisco Pérez Bes, appointed president and deputy, respectively, of the Spanish Agency for Data Protection
Lorenzo Cotino Hueso and Francisco Pérez Bes have been appointed president and deputy, respectively, of the Spanish Agency for Data Protection (AEPD). The appointments follow an evaluation of their merit, capacity, competence and suitability by a selection committee, and the favorable vote of the Congress of Deputies.
Lorenzo Cotino Hueso holds a PhD and a degree in Law from the University of Valencia, where he has lectured for thirty years, a master's degree in fundamental rights from ESADE in Barcelona, and a degree and a Diploma in Advanced Studies in Political Science from the UNED. He has been Professor of Constitutional Law at the University of Valencia since 2017.
He has received national research awards, as well as the Extraordinary Doctorate Award and recognition as doctor honoris causa or honorary professor from foreign universities. He is the author of 14 monographs, coordinator of 26 others, and has written more than two hundred scientific articles or chapters, most of them related to privacy, data protection and transparency. He has also directed and managed some twenty research projects in these areas and has coordinated the network of specialists www.derechotics.com since 2024.
He has been deputy judge of the High Court of Justice of the Valencian Community (administrative litigation 2000‒2019), member of the Transparency Council of the Valencian Community and vice-president of the Open Government Forum of Spain.
He has also participated in consulting activities for the Administration related to compliance with the AI Regulation and the preparation of an AI sandbox in Spain, with similar experiences in different companies under agreements signed between them and the University of Valencia. Since 2020 he has been director of privacy of OdiseIA, the Observatory of the Ethical and Social Impact of Artificial Intelligence.
For his part, Francisco Pérez Bes holds a degree in Law and has a master's degree in International Business Law, a postgraduate degree in corporate finance and an IP&IT programme from ESADE. He holds a research proficiency granted by the UNED and, at present, is a PhD student at the USAL. He has completed the course for security chiefs of the protection service and control bodies taught by the National Security Office.
He was general secretary and data protection delegate of the National Cybersecurity Institute of Spain (INCIBE) between 2014 and 2019 and Compliance Officer at Ladbrokes Betting & Gaming between 2011 and 2014. He has management experience in different companies and entities linked to the legal field and was a partner of Digital Law at Ecix Group until his appointment.
He has received various awards such as the medal for cyber defense awarded by the Spanish Association of Computer Experts, the medal for merit with white badge of the Civil Guard and the medal for merit of the Lawyers of Castilla y León. He has extensive teaching experience and has published works and articles on data protection, cybersecurity and other related subjects.
Favorable vote of Congress:
Article 48 of the Organic Law on Data Protection and Guarantee of Digital Rights establishes that the presidency of the Spanish Data Protection Agency and its deputy shall be appointed by the Government, on a proposal from the Ministry of Justice, among persons of recognized professional competence, in particular in the field of data protection.
For its part, Article 20 of the Statute of the Spanish Data Protection Agency establishes that the proposal of the most suitable candidates must be made by a selection committee after evaluating their merit, capacity, competence and suitability.
Once this proposal was made, the Government submitted it to the Congress of Deputies, and it was ratified by the Congressional Justice Commission on February 19 in a public vote by an absolute majority.
The terms of office of the President and Deputy of the Spanish Data Protection Agency have a duration of five years and may be renewed for another period of equal duration.
Data protection in Mexico

- INAI guides representatives of public institutions to comply with their transparency obligations
During the session “Doubts and guidance on information uploading” for federal regulated entities, the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) urged public institutions to comply with their transparency obligations. January marks the beginning of the period for uploading to the National Transparency Platform the information generated by the obligated subjects in the fourth quarter of 2024. INAI Commissioner and Coordinator of the Institute's Information Technology Commission, Norma Julieta del Río Venegas, said that this month public servants must comply with the uploading of information to the Transparency Obligations Portal System (SIPOT) of the National Transparency Platform (PNT) and guarantee that it will be available as of February 1, 2025.
- INAI hands over the baton in international organizations
The National Institute for Transparency, Access to Information and Personal Data Protection (INAI) held a working meeting with a delegation from the National Privacy Commission of the Philippines, the authority that assumed the Secretariat of the Global Privacy Assembly (GPA). The purpose of the meeting was to share INAI's experience in leading the GPA, the international forum of data protection authorities, regulatory agencies and other stakeholders for the exchange of best practices and regulatory approaches. INAI served as Chair and Secretariat of the GPA until October 2024, when it handed over the baton to the authorities of Bulgaria and the Philippines, respectively. INAI also handed over the chairmanship of the International Conference of Information Commissioners (ICIC) to Albania on December 6, 2024, and of the Integrity Network to Italy on December 9. INAI had been re-elected to chair ICIC for a further three years. The sixth ordinary session of the ICIC Executive Committee approved INAI's proposal that, on an interim basis, the Albanian Guarantor Body would serve as ICIC Chair and Secretariat until the ICIC members reconvene.
- The INAI Plenary and Buen Gobierno agree on an orderly and legal transfer of functions
The Plenary of INAI met this morning with the Anti-Corruption and Good Governance Secretary, Raquel Buenrostro Sánchez, to agree on an orderly and efficient transfer of functions in strict compliance with the law. Part of the work of the National Institute of Transparency, Access to Information and Protection of Personal Data (INAI) will be absorbed by the Anti-Corruption and Good Governance Secretariat, as a result of the reforms that extinguish INAI as an autonomous body guaranteeing the rights of access to information and protection of personal data. The changes will take place once the Congress of the Union approves the secondary laws that regulate the constitutional changes. INAI's Commissioner President, Adrián Alcalá Méndez, and Commissioners Norma Julieta del Río Venegas, Blanca Lilia Ibarra Cadena and Josefina Román Vergara agreed with Secretary Buenrostro Sánchez on the importance of coordination to guarantee a transfer that contemplates both the responsible parties and specific deadlines, and in which priority is given to substantive matters related to the rights protected by INAI, as well as administrative, legal and technological aspects.
- Delete personal data from the devices you take to the Recycling Center
Electronic devices such as smartphones, tablets, computers, printers, cameras and video cameras, MP3 players, video game consoles and hard drives may contain personal data that could put your privacy at risk if you do not securely erase the information before discarding them. On January 30 and 31, the Secretariat of the Environment of the Government of Mexico City will hold the Reciclatrón, a collection day for electronic and electrical waste for recycling. Those interested can bring the devices they no longer want to the parking lot of the Universum Museum of the UNAM, located in the Circuito del Centro Cultural Universitario, Coyoacán, between 9 am and 4 pm. Before taking your devices, check the information they contain and perform a secure deletion to prevent anyone from accessing confidential information that could be used to defraud you, extort you or steal your identity.
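For individual files on a traditional magnetic hard drive, a secure deletion can be approximated by overwriting the contents before removing the file. The sketch below is purely illustrative and not official guidance; on flash-based devices such as smartphones and SSDs, wear leveling may leave copies of the data behind, so the device's built-in encrypted factory reset is the safer option.

```python
import os
import secrets

def secure_overwrite(path: str, passes: int = 3) -> None:
    """Overwrite a file with random bytes before deleting it.

    Note: this approach is mainly meaningful for traditional magnetic
    storage; flash storage (phones, SSDs) may retain old copies due to
    wear leveling, where an encrypted factory reset is preferable.
    """
    size = os.path.getsize(path)
    with open(path, "r+b") as f:
        for _ in range(passes):
            f.seek(0)
            f.write(secrets.token_bytes(size))  # replace contents with noise
            f.flush()
            os.fsync(f.fileno())  # force the write to disk each pass
    os.remove(path)
```

The multiple passes and `os.fsync` calls ensure each overwrite actually reaches the disk rather than sitting in an OS buffer.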
Data protection in India

- Key Highlights of India's Draft Digital Personal Data Protection Rules, 2025
On January 3, 2025, India’s Ministry of Electronics and Information Technology ("MeitY") released the Draft Digital Personal Data Protection Rules, 2025 ("Draft Rules") for public comment. The primary aim of these Draft Rules is to operationalize the 2023 Digital Personal Data Protection Act (the "Act") and ensure robust protection and privacy of personal data in the digital realm. Below, we highlight the most notable provisions of the Draft Rules.
• Notice for Consent: To obtain informed consent from a Data Principal, a Data Fiduciary must provide the Data Principal with a clear and standalone notice outlining what data is to be collected, the purpose for the processing, and how consent can be withdrawn.
• Consent Managers and Rights of Data Principals: Defined under the DPDP Act, a Consent Manager is registered with the Data Protection Board and serves as a single point of contact for Data Principals to give, manage, review, and withdraw consent through a transparent and secure platform. Data Fiduciaries and Consent Managers must clearly publish on their website or app the process for Data Principals to exercise their rights under the Act, including the right to request access to or deletion of their personal data.
• Security Safeguards: Data Fiduciaries must implement adequate security measures to protect personal data, such as encryption, access control, monitoring for unauthorized access, and data backups. Contracts between Data Fiduciaries and Data Processors must also ensure that security measures are in place to prevent data breaches.
• Data Breach Notification: In the event of a breach, Data Fiduciaries must promptly notify affected Data Principals, including an explanation of the nature, extent, and timing of the breach. Within 72 hours, Data Fiduciaries must additionally notify the Data Protection Board of the breach, including the events that led to the breach, actions taken to mitigate risks, and the identity of the individual responsible, if known.
• Data Retention: Certain e-commerce entities, online gaming intermediaries, and social media platforms with a significant number of registered users in India must delete personal data within a specified period of time unless the user actively maintains their account. Generally, these entities may only retain personal data for up to three years from the date of a user’s last interaction.
• Processing Personal Data Of Children: A Data Fiduciary must implement measures to ensure that the person providing consent for a child’s data processing is the child’s parent or legal guardian, and that the parent or guardian is identifiable. Certain Data Fiduciaries, such as healthcare providers or educational institutions, may be exempt from specific obligations when processing children’s data, under defined conditions.
• Data Protection Impact Assessments (DPIAs): If the Central Government identifies an entity as a Significant Data Fiduciary based on certain enumerated factors, including volume and sensitivity of the data processing, that entity must conduct annual DPIAs to assess risks associated with their data processing activities.
• Cross-Border Data Transfers: Data Fiduciaries processing data within India, or in connection with providing goods or services to Data Principals from outside India, must adhere to any requirements established by the Central Government regarding the transfer of personal data to foreign states or their entities.
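Two of the deadlines above lend themselves to simple automated checks. The sketch below is a hypothetical helper (the names and structure are our own, not taken from the Draft Rules) that flags personal data held past the three-year retention ceiling and computes the 72-hour deadline for the detailed breach report to the Data Protection Board:

```python
from datetime import datetime, timedelta

# Figures as described in the Draft Rules: a three-year retention ceiling
# for covered platforms (counted from the user's last interaction) and a
# 72-hour window for the detailed report to the Data Protection Board.
RETENTION_LIMIT = timedelta(days=3 * 365)
BOARD_REPORT_WINDOW = timedelta(hours=72)

def retention_expired(last_interaction: datetime, now: datetime) -> bool:
    """True once data has been held past the retention ceiling
    without the user interacting with their account."""
    return now - last_interaction > RETENTION_LIMIT

def board_report_deadline(breach_detected: datetime) -> datetime:
    """Latest time to file the detailed breach report with the Board."""
    return breach_detected + BOARD_REPORT_WINDOW
```

In practice such checks would run as a scheduled job against account activity records; the point here is only that both obligations reduce to straightforward date arithmetic once the triggering events are logged.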
- Impact of India’s data protection framework on policy and business
In 2023, India embarked on a new era of data protection by enacting the Digital Personal Data Protection Act, 2023 (DPDP Act). Subsequently, the government released the Draft Digital Personal Data Protection Rules, 2025 (draft DPDP rules) in January 2025 for stakeholder consultation, which concluded on 5 March 2025.
The DPDP Act has been signed into law by the President of India but is not yet in force. Until it comes into force, the 2011 SPDI (sensitive personal data or information) rules will continue to govern data protection in India. The Ministry of Electronics and Information Technology (MeitY) is considering a two-year transition period for businesses to align with the DPDP Act and the forthcoming rules.
Recent statements by the MeitY minister suggest that the government expects businesses to start aligning their policies and practices with the DPDP Act, but this is easier said than done. The details of various provisions of the DPDP Act will be clarified through the upcoming rules, which are yet to be finalized, and the government and the Data Protection Board of India (DPB) are also expected to prescribe further operational requirements. Until the draft DPDP rules are finalized and the DPDP Act comes into force, businesses will have to continue complying with the SPDI rules while readying their policies and practices for compliance with the DPDP Act and the rules once they take effect.
Compliance with one regulation and the road to readiness to adhere to another appears feasible, particularly considering a two-year transition period. However, the reality presents a stark contrast. The SPDI rules provide merely a foundational framework for data protection in India, characterized by less rigorous requirements and inconsistent enforcement, thereby complicating the transition to compliance with the DPDP Act. Furthermore, the DPDP Act, in conjunction with the anticipated DPDP rules, in certain aspects yields to sector-specific laws and functions as a basic data protection framework, enabling sectoral regulations to impose additional or more stringent requirements, especially concerning the cross-border transfer of personal data. The necessity to align with other regulations may lead to substantial operational and logistical hurdles, affecting the ease of doing business, particularly for startups and smaller entities, which may lack the infrastructure and resources to implement and maintain compliance protocols effectively.
The DPDP Act’s principles-based approach presents additional challenges at the policy and operational levels.
Potential policy uncertainty. The provisions of the DPDP Act leave room for policy uncertainty, especially when read with requirements under sectoral laws. For instance, in 2020, the government banned certain Chinese mobile apps that were “stealing and transmitting users’ data in an unauthorized manner” outside India. However, now the position is marked by an unexplained shift in the government’s stance towards Chinese technology. This is illustrated by recent comments made by the MeitY minister concerning hosting the Chinese open-source AI model, DeepSeek, on Indian servers to alleviate concerns surrounding data security and privacy.
The DPDP Act authorizes the central government to restrict personal data transfer to foreign countries or territories. It mandates that such cross-border data transfers adhere to stringent protections outlined in sectoral laws. The draft DPDP rules further stipulate that any data transfer, within and outside India, would have to comply with restrictions articulated in general or special orders by the central government, particularly regarding accessibility to foreigners or entities linked to foreign states.
However, the legislation’s lack of detailed safeguards and reliance on executive discretion for cross-border data transfers risk arbitrary decision making and frequent regulatory changes. This can create policy uncertainty, undermining confidence in the legal framework. The potential unpredictability of these notifications and shifting policies impose compliance challenges and elevate regulatory risks for businesses and stakeholders.
Another provision that potentially creates policy uncertainty is the clause in the draft DPDP rules concerning the notification of personal data breaches. According to the clause, a data fiduciary must inform each affected data principal and the DPB of any personal data breach “without delay”. Subsequently, the data fiduciary is required to submit a detailed report to the DPB within 72 hours. The term “without delay” remains undefined in the draft DPDP rules. This lack of specified initial reporting timelines may conflict with existing regulations mandated by the CERT-In, further contributing to policy uncertainty and unnecessary duplication of efforts in high-risk cases.
Challenges in notifying personal data breaches. While the draft DPDP rules and the DPDP Act require the reporting of all personal data breaches to the DPB and the affected data principal, they do not provide any threshold or criteria for reporting such breaches, obliging data fiduciaries to report every instance of a personal data breach, no matter how trivial. This requirement, lacking guidance on the nature, scale and risk associated with an incident, may result in excessive reporting to the DPB, hampering its ability to mitigate the risks arising from breaches effectively. Furthermore, this stipulation could misrepresent system failures as personal data breaches, adding to confusion and potentially overwhelming affected individuals and the DPB, ultimately causing reputational damage.
The personal data breach reporting requirement also does not consider other sectoral regulations such as the CERT-In Directions on Cybersecurity and the Telecommunications (Telecom Cyber Security) Rules, 2024, issued under the Telecom Act, 2023, which mandate the notification of similar data breaches to various regulatory authorities. This oversight stems from a silo-based approach by government authorities who should have engaged in a comprehensive whole-of-government exercise to establish a single-window regulatory pathway for reporting personal data breaches. Multiple “reporting nodes” are likely to exacerbate vulnerabilities rather than mitigate them.
Vague data localization requirements. Any data localization mandate is likely to conflict with various requirements under foreign laws regarding the disclosure or transfer of personal data to foreign government agencies. The draft DPDP rules propose a data localization requirement for significant data fiduciaries, whereby certain personal datasets and traffic data are prohibited from being transferred outside India based on the recommendations of a committee established by the central government. This represents a shift from the government’s previous stance of moving away from strict data localization requirements.
Arbitrary powers for requisition of information. The power to call for information from any data fiduciary and the intermediary has been mandated without any procedural safeguards, and appears to contradict the Supreme Court decision in the landmark case of Justice KS Puttaswamy (Retd) & Anr v Union of India & Ors (2017), which recognized the fundamental right to privacy guaranteed under Article 21 of the Indian Constitution. The judgment further identified that any request for disclosure of data that violates the right to privacy would have to meet three requirements:
1. the action must be sanctioned by law;
2. the proposed action must be necessary in a democratic society for a legitimate aim; and
3. the action must be proportional, ensuring a rational nexus between the objects and the means adopted to achieve them.
The draft DPDP rules and the DPDP Act fail to consider these three requirements and instead endow the government with wide discretionary powers to call for any information. This power is vague and arbitrary as it lacks any procedural safeguards or basis for seeking this information. The DPDP Act does not even provide the option or recourse to the data fiduciaries/intermediaries to object to or challenge the request for information.
While the draft DPDP rules propose to seek information for certain identified purposes such as security of the state, performance of a statutory function, etc., these purposes are worded broadly. This gives excessive powers to the government to seek any information and creates an imbalance favoring the government over not only individual privacy rights but the ease of doing business.
To conclude, while the DPDP Act and the draft DPDP rules mark significant steps in providing a dedicated data protection legal framework and aligning India with global privacy standards, several ambiguities and operational challenges remain. While some of these challenges flow from the DPDP Act itself, certain issues can be resolved through a well thought out execution of the DPDP rules, which would be essential for striking the right balance between individual privacy rights, security, innovation and making the system business friendly.
Data protection in Brazil

- Biometric Data Collection by Tools for Humanity
Following recent news about the collection of biometric data by Tools for Humanity (TFH), the company responsible for manufacturing the advanced camera (Orb) used to collect iris, face and eye data from data subjects with the aim of developing a “unique human condition verification system” known as World ID, the National Data Protection Authority (ANPD) clarifies:
On November 11, 2024, the General Coordination of Inspection of the National Data Protection Authority initiated inspection proceedings no. 00261.006742/2024-53 against the company to investigate the processing of users' biometric data in the context of the World ID project.
Biometric personal data, such as palm prints, fingerprints, retina or iris scans, facial shape, voice and gait, constitute sensitive personal data. Because of the higher risks that processing this type of personal data may pose, the legislator has granted it a more rigorous protection regime, limiting the legal hypotheses that authorize its processing.
As part of the Inspection Process, the General Inspection Coordination requested TFH to provide clarifications regarding the following aspects of the processing of personal data:
• the context in which the processing activities occur;
• the material aspects of processing operations;
• the legal hypothesis that underpins the data processing;
• the transparency of the processing of personal data;
• the exercise of rights by data subjects;
• the assessment of possible consequences of the processing of personal data in relation to the rights to privacy and data protection of the data subjects;
• the processing of personal data of children and adolescents; and
• existing information security and personal data protection measures.
The General Coordination of Inspection also requested the Record of Personal Data Processing Operations and the Report on the Impact on the Protection of Personal Data.
- ANPD redesigns the Standardization page on the portal
The National Data Protection Authority (ANPD) has launched a new page on its website dedicated to standardization. The objective is to clarify the Authority's regulatory process and make available the documents produced during the regulatory cycle, namely: the Regulatory Agenda, the Regulatory Impact Analysis, Social Participation, and Regulatory Monitoring, which includes the Assessment of Regulatory Outcomes and the Monitoring of the Regulated Environment.
In an increasingly dynamic, complex and innovative digital environment, the ANPD regulatory process plays a crucial role in ensuring the country's economic and technological development while guaranteeing the preservation of fundamental rights to the protection of personal data and individual privacy.
To this end, ANPD has been strengthening its regulatory process in line with international best practices, promoting greater predictability, transparency, social participation, innovation and effectiveness, based on information and evidence, with the aim of building a more confident, modern and competitive regulatory environment in Brazil.
In addition to the content, the design of the page was also considered. The new interface is more intuitive and has improved features, highlighting the Authority's efforts to ensure transparency and accountability in the performance of its functions.
- ANPD warns about misinformation involving compensation for data leaks
The National Data Protection Authority (ANPD) clarifies that the news about compensation owed by the government to beneficiaries of the extinct Auxílio Brasil program, due to an alleged leak of data from the Single Registry for Social Programs (CadÚnico), is false. The Ministry of Development and Social Assistance, Family and Fight against Hunger (MDS) has also reported on this misinformation.
We would like to clarify that the ANPD is the entity responsible for ensuring compliance with the General Personal Data Protection Law – LGPD (Law No. 13,709, of August 14, 2018) in Brazil and, to this extent, guaranteeing due protection of the fundamental rights of freedom, privacy and free development of the personality of individuals.
In the event of a data leak, if the source is known, the data subject must contact the organizations controlling the data directly to inquire whether their data was in fact exposed, as well as which data, specifically, was affected and what measures were taken. Clarifications may be requested regarding the use of their personal data, security procedures and possible leaks or sharing with other organizations or third parties.
- In Spain, ANPD participates in a global forum and talks with the country's data protection agency with a view to extending a memorandum of understanding
In Madrid, an agreement between the two countries seeks to develop and disseminate good practices in data protection. In Barcelona, a forum on mobile telecommunications and the digital economy brought together representatives from 150 countries.
The CEO of the National Data Protection Authority (ANPD), Waldemar Gonçalves, met in Madrid on Friday (7) with the representative of the Spanish Data Protection Agency (AEPD), Lorenzo Cotino. Since 2021, the ANPD has maintained a memorandum of understanding with the AEPD, which is currently being renewed.
In addition to extending the partnership between the agencies, the meeting also addressed the development of joint actions and the promotion, dissemination and practical application of data protection standards. The discussion also covered the search for greater convergence of standards between the two countries, the exchange of specialized knowledge, and the establishment of practices that strengthen the respective technical bodies and promote legal assistance mechanisms.
- ANPD participates in the 63rd Meeting of the Bureau of Convention 108 of the Council of Europe
On March 19 and 20, 2025, the National Data Protection Authority (ANPD) participated in the 63rd Meeting of the Bureau of the Advisory Committee of Convention 108 of the Council of Europe, held at the Paris office of the international organization.
The institution was represented by Lucas Anjos, of the International Affairs Coordination (CGRII), who presented to the Bureau the ANPD's recent regulatory developments, such as the formalization of procedures for international data transfer mechanisms and specific inspection cases.
The meeting brought together experts and representatives of data protection authorities to discuss essential topics related to the global governance of personal data protection.
- ANPD publishes results of selection of consultancy for AI regulatory sandbox
The University of São Paulo (USP) was chosen to assist in the design of the collaborative experiment between the regulator and regulated entities.
The National Data Protection Authority (ANPD) has selected the University of São Paulo (USP) as a consultant for the Artificial Intelligence regulatory sandbox project. In the selection process, the institution obtained 69 points out of a maximum of 90. Second place went to the Federal University of Rio Grande do Sul, with 66 points, followed by the University of Brasília Foundation, with 47. The selection was confirmed on Monday (24), after the end of the appeal period, with no change from the initial result.
The Temporary Evaluation Committee, formed by members of the ANPD and the United Nations Development Program (UNDP), evaluated the following criteria: Qualification and Experience of the Institution, Qualification and Experience of the Technical Team, and Work Plan, Methodology and Approach.
A regulatory sandbox is a collaborative experimentation between regulators, regulated entities and other stakeholders, such as technology and innovation companies, academics and civil society organizations. The objective is to test innovations within a regulatory framework, adopting a structured and secure methodology. The initiative led by the Authority is the result of a partnership signed with UNDP lasting 20 months.
The ANPD project is expected to deliver results such as a regulation tested in advance, increased algorithmic transparency, and the promotion of responsible innovation in AI.
If you have any questions, please send us an email to datasecurity@catts.eu
How can we help?
CATTS is your dedicated partner for comprehensive data protection and compliance solutions. From strategic guidance and customized training to data security assessments and regulatory monitoring, we empower businesses for ethical success in the digital age. Whether it's GDPR compliance, Privacy Impact Assessments, or incident response, CATTS ensures tailored strategies to your unique data protection needs.
Contact Us