As 2024 drew to a close, global regulators took significant steps in data protection. Updates from the European Union, Poland, the Netherlands, Spain, Mexico, India, and Brazil covered a wide range of topics, including GDPR enforcement, AI governance, international data transfers, and practical guidance for safeguarding personal data on phones, social media, and during travel. In this edition of CATTS Data Protection’s Quarterly News for FY24/04, we explore these developments and their implications as we move into 2025.
- Data protection in the EU
- Data protection in Poland
- Data protection in the Netherlands
- Data protection in Spain
- Data protection in Mexico
- Data protection in India
- Data protection in Brazil

Data protection in the EU

- EDPB adopts Opinion on processors, Guidelines on legitimate interest, Statement on draft regulation for GDPR enforcement, and work programme 2024-2025
09 October - During its latest plenary, the European Data Protection Board (EDPB) adopted an Opinion on certain obligations following from the reliance on processor(s) and sub-processor(s), Guidelines on legitimate interest, a Statement on the draft regulation laying down additional procedural rules for GDPR enforcement, and the EDPB work programme 2024-2025.
First, the EDPB adopted an Opinion on certain obligations following from the reliance on processor(s) and sub-processor(s) following an Art. 64(2) GDPR request to the Board by the Danish Data Protection Authority (DPA). Art. 64(2) GDPR provides that any DPA can ask the Board to issue an opinion on matters of general application or producing effects in more than one Member State.
The Opinion is about situations where controllers rely on one or more processors and sub-processors. In particular, it addresses eight questions on the interpretation of certain duties of controllers relying on processors and sub-processors, as well as the wording of controller-processor contracts, arising in particular from Art. 28 GDPR.
The Opinion explains that controllers should have the information on the identity (i.e. name, address, contact person) of all processors, sub-processors etc. readily available at all times so that they can best fulfil their obligations under Art. 28 GDPR. In addition, the controller’s obligation to verify whether the (sub-)processors present ‘sufficient guarantees’ applies regardless of the risk to the rights and freedoms of data subjects, although the extent of such verification may vary, notably on the basis of the risks associated with the processing.
The Opinion also states that while the initial processor should ensure that it proposes sub-processors with sufficient guarantees, the ultimate decision and responsibility on engaging a specific sub-processor remains with the controller.
The EDPB considers that under the GDPR the controller does not have a duty to systematically ask for the sub-processing contracts to check if data protection obligations have been passed down the processing chain. The controller should assess whether requesting a copy of such contracts or reviewing them is necessary for it to be able to demonstrate compliance with the GDPR.
In addition, where transfers of personal data outside of the European Economic Area take place between two (sub-)processors, the processor as data exporter should prepare the relevant documentation, such as relating to the ground of transfer used, the transfer impact assessment and the possible supplementary measures. However, as the controller is still subject to the duties stemming from Art. 28(1) GDPR on ‘sufficient guarantees’, besides the ones under Art. 44 to ensure that the level of protection is not undermined by transfers of personal data, it should assess this documentation and be able to show it to the competent Data Protection Authority.
- EDPB selects topic for next year’s Coordinated Action
10 October - During its October 2024 plenary, the European Data Protection Board (EDPB) selected the topic for its fourth Coordinated Enforcement Action (CEF), which will concern the implementation of the right to erasure (‘right to be forgotten’) by controllers. Data Protection Authorities (DPAs) will join this action on a voluntary basis in the coming weeks and the action itself will be launched during the first semester of 2025.
The right to erasure (Art. 17 GDPR) is one of the most frequently exercised data protection rights and one about which DPAs frequently receive complaints. The aim of this coordinated action will be, among other objectives, to evaluate the implementation of this right in practice. For example, this will be done by analysing and comparing the processes put in place by different controllers to identify the most important issues in complying with this right, but also to get an overview of best practices.
In a coordinated enforcement action, the EDPB prioritises a specific topic for DPAs to work on at national level. In the past three years, DPAs have already coordinated their national actions on different topics, namely: the use of cloud in the public sector, the designation and position of Data Protection Officers and the implementation of the right of access by data controllers.
The results of these national actions are then aggregated and analysed together to generate deeper insight into the topic and to allow for targeted follow-up at both national and EU level.
- EDPB meets with adequate countries
On 8 October 2024, the European Data Protection Board met with Commissioners and representatives of Data Protection Authorities (DPAs) from the fifteen countries that have been granted an EU adequacy decision. The meeting took place in the margins of the EDPB’s October plenary and reflects the EDPB’s commitment to international engagement.
The European Commission has so far recognised the following adequate countries: Andorra, Argentina, Canada, Faroe Islands, Guernsey, Israel, Isle of Man, Japan, Jersey, New Zealand, Republic of Korea, Switzerland, United Kingdom, Uruguay and United States.
Adequacy decisions are the result of a high degree of convergence of data protection laws and enable safer data flows.
During the meeting, the EDPB and the DPAs from the adequate countries discussed multilateral engagement on advisory work and guidelines, and on enforcement cooperation.
- Report on Article 36 alerts of Schengen Information System decision
The members of the Schengen Information System II Supervision Coordination Group (SIS II SCG) and the Coordinated Supervision Committee (CSC) conducted a coordinated inspection activity concerning the Schengen Information System (SIS). The SIS is the most widely used and largest information sharing system for security and border management in Europe.
Within the scope of the inspections, the alerts and procedures with regard to Article 36 of the SIS II Decision were checked, as there has been a Europe-wide increase in the number of alerts in this alert category. In order to achieve comparable results, a questionnaire was developed by the SIS SCG and used by the Member States for their inspections.
The results indicated various differences between the nineteen Member States which took part in the coordinated inspection activity, enabling the CSC to draw conclusions and to recommend the necessary measures with regard to the documentation of the alerts, their quality, the time limits and retention period, as well as substantive and technical issues.
- EDPB stakeholder event AI models
The EDPB is holding a stakeholder event on “AI models” with participants representing European sector associations, organisations, NGOs, individual companies, law firms and academics.
During the event, the EDPB will collect input for the preparation of a consistency opinion on AI models, requested by the Irish Data Protection Authority under Art. 64(2) GDPR.
EDPB Chair Anu Talus said: “During the stakeholder event we will tackle a number of targeted questions, which will feed our reflection in the context of the preparation of our Opinion on AI models. Stakeholder input is especially valuable for these fast-moving technologies with an exceptional societal impact.”
The EDPB's opinion on “AI models” is due by the end of 2024.
- EDPB adopts its first report under the EU-U.S. Data Privacy Framework and a statement on the recommendations on access to data for law enforcement
During its latest plenary, the European Data Protection Board (EDPB) adopted a report on the first review of the EU-U.S. Data Privacy Framework (DPF), as well as a statement on the recommendations of the high-level group (HLG) on access to data for effective law enforcement.
The EDPB welcomes the efforts by the U.S. authorities and the European Commission to implement the DPF, and takes note of several developments that took place since the adoption of the adequacy decision in July 2023.
Regarding commercial aspects, i.e. the application and enforcement of requirements applying to companies self-certified under this framework, the EDPB notes that the U.S. Department of Commerce took all relevant steps to implement the certification process. This includes developing a new website, updating procedures, engaging with companies, and conducting awareness-raising activities.
In addition, the redress mechanism for EU individuals has been implemented and there is comprehensive complaint-handling guidance published on both sides of the Atlantic. However, the low number of complaints received so far under the DPF highlights the importance of having U.S. authorities initiate monitoring activities concerning compliance of DPF-certified companies with the substantive DPF Principles.
The EDPB encourages the development of guidance by U.S. authorities, clarifying the requirements that DPF-certified companies would need to comply with when they transfer personal data that they have received from EU exporters. Guidance by U.S. authorities on human resources data would also be welcome. The EDPB expresses its availability to provide feedback on these guidance documents.
Concerning the access by U.S. public authorities to personal data transferred from the EU to certified organisations, the EDPB focused on the effective implementation of the safeguards introduced by Executive Order 14086 in the U.S. legal framework, such as the necessity and proportionality principles and the new redress mechanism. The Board considers that the elements of the redress mechanism are in place; at the same time, it renews its call to the European Commission to monitor the practical functioning of the different safeguards, e.g. the implementation of the principles of necessity and proportionality. The EDPB also recommends that the Commission monitor future developments related to the U.S. Foreign Intelligence Surveillance Act, in particular given the extended reach of Section 702 after its re-authorisation by the U.S. Congress earlier this year.
Finally, the Board recommends that the next review of the EU-U.S. adequacy decision should take place within three years or less.
The statement on the recommendations of the HLG on access to data for effective law enforcement underlines that fundamental rights must be safeguarded when law enforcement agencies access the personal data of individuals. While the EDPB supports the aim of effective law enforcement, it points out that some of the HLG’s recommendations could cause serious intrusiveness vis-à-vis fundamental rights, in particular the respect for privacy and family life.
While the EDPB positively notes that the recommendations may lead to the establishment of a level playing field on data retention, it considers that a broad and general obligation for all service providers to retain data in electronic form would create a significant interference with the rights of individuals. The EDPB therefore questions whether this would meet the requirements of necessity and proportionality under the Charter of Fundamental Rights of the EU and the CJEU jurisprudence.
In its statement, the EDPB also emphasises that the recommendations concerning encryption should not prevent its use or weaken the effectiveness of the protection it provides. For example, the introduction of a client-side process allowing remote access to data before it is encrypted and sent on a communication channel, or after it is decrypted at the recipient, would in practice weaken encryption. Preserving the protection and effectiveness of encryption is important to prevent negative impacts on the respect for private life and confidentiality, and to ensure that freedom of expression and economic growth, which depend on trustworthy technologies, are safeguarded.
- EDPB clarifies rules for data sharing with third country authorities and approves EU Data Protection Seal certification
During its latest plenary, the European Data Protection Board (EDPB) published guidelines on Art. 48 GDPR about data transfers to third country authorities and approved a new European Data Protection Seal.
EDPB helps organisations assess data transfer requests by third country authorities
In a highly interconnected world, organisations receive requests from public authorities in other countries to share personal data. Such sharing can, for instance, help to collect evidence in criminal cases, to check financial transactions or to approve new medications.
When a European organisation receives a request for a transfer of data from a ‘third country’ (i.e. non-European countries) authority, it must comply with the General Data Protection Regulation (GDPR). In its guidelines, the EDPB zooms in on Art. 48 GDPR and clarifies how organisations can best assess under which conditions they can lawfully respond to such requests. In this way, the guidelines help organisations to make a decision on whether they can lawfully transfer personal data to third country authorities when asked to do so.
Judgements or decisions from third-country authorities cannot automatically be recognised or enforced in Europe. If an organisation replies to a request for personal data from a third-country authority, this data flow constitutes a transfer and the GDPR applies. An international agreement may provide for both a legal basis and a ground for transfer. If there is no international agreement, or if the agreement does not provide for an appropriate legal basis or safeguards, other legal bases or other grounds for transfer could be considered, in exceptional circumstances and on a case-by-case basis.
Approval of EU Data Protection Seal
During the plenary meeting, the Board also adopted an opinion approving the Brand Compliance certification criteria concerning processing activities by controllers or processors. In September 2023, the Board already adopted an opinion on the approval of the Brand Compliance national certification criteria, making them officially recognised certification criteria in the Netherlands for data processing by organisations. The approval of the new opinion means that these criteria will now be applicable across Europe and as a European Data Protection Seal.
GDPR certification helps organisations demonstrate their compliance with data protection law. This transparency helps people trust the product, service, process or system for which organisations process their personal data.
- EDPB calls for coherence of digital legislation with the GDPR
During its December 2024 plenary, the European Data Protection Board (EDPB) adopted a statement on the second report of the European Commission on the application of the General Data Protection Regulation (GDPR).
In its statement, the EDPB welcomes the reports from the European Commission and the Fundamental Rights Agency. Importantly, the EDPB underlines the importance of legal certainty and coherence of digital legislation with the GDPR, and recalls some of its ongoing initiatives to clarify the enforcement interplay of the GDPR with the AI Act, the EU Data Strategy and the Digital Services Package.
In addition, the EDPB announces it will step up the production of content for non-experts, small and medium-sized enterprises (SMEs) and other groups.
Finally, the Board highlights the genuine need for additional financial and human resources to help DPAs and the EDPB deal with increasingly complex challenges and additional competences.
- EDPB opinion on AI models: GDPR principles support responsible AI
The European Data Protection Board (EDPB) has adopted an opinion on the use of personal data for the development and deployment of AI models. This opinion looks at 1) when and how AI models can be considered anonymous, 2) whether and how legitimate interest can be used as a legal basis for developing or using AI models, and 3) what happens if an AI model is developed using personal data that was processed unlawfully. It also considers the use of first-party and third-party data.
The opinion was requested by the Irish Data Protection Authority (DPA) with a view to seeking Europe-wide regulatory harmonisation. To gather input for this opinion, which deals with fast-moving technologies that have an important impact on society, the EDPB organised a stakeholders’ event and had an exchange with the EU AI Office.
Regarding anonymity, the opinion says that whether an AI model is anonymous should be assessed on a case by case basis by the DPAs. For a model to be anonymous, it should be very unlikely (1) to directly or indirectly identify individuals whose data was used to create the model, and (2) to extract such personal data from the model through queries. The opinion provides a non-prescriptive and non-exhaustive list of methods to demonstrate anonymity.
With respect to legitimate interest, the opinion provides general considerations that DPAs should take into account when they assess if legitimate interest is an appropriate legal basis for processing personal data for the development and the deployment of AI models.
A three-step test helps assess the use of legitimate interest as a legal basis. The EDPB gives the examples of a conversational agent to assist users, and the use of AI to improve cybersecurity. These services can be beneficial for individuals and can rely on legitimate interest as a legal basis, but only if the processing is shown to be strictly necessary and the balancing of rights is respected.
The opinion also includes a number of criteria to help DPAs assess if individuals may reasonably expect certain uses of their personal data. These criteria include: whether or not the personal data was publicly available, the nature of the relationship between the individual and the controller, the nature of the service, the context in which the personal data was collected, the source from which the data was collected, the potential further uses of the model, and whether individuals are actually aware that their personal data is online.
If the balancing test shows that the processing should not take place because of the negative impact on individuals, mitigating measures may limit this negative impact. The opinion includes a non-exhaustive list of examples of such mitigating measures, which can be technical in nature, or make it easier for individuals to exercise their rights or increase transparency.
Finally, when an AI model was developed with unlawfully processed personal data, this could have an impact on the lawfulness of its deployment, unless the model has been duly anonymised.
Considering the scope of the request from the Irish DPA, the vast diversity of AI models and their rapid evolution, the opinion aims to give guidance on various elements that can be used for conducting a case by case analysis.
In addition, the EDPB is currently developing guidelines covering more specific questions, such as web scraping.
Data protection in Poland

- AI and cybersecurity will be the subject of the Social Insurance Institution’s and UODO's seminars
The seminar "Time of challenges – designing AI systems and implementing NIS2 in the organisation" will be held on October 9, 2024, from 10.00 a.m. to 2.00 p.m. at the headquarters of the Social Insurance Institution branch in Chorzów at 45 Generała Henryka Dąbrowskiego Street. This is the third meeting in a series of four events related to the subject of personal data protection, which are organised by the Social Insurance Institution in cooperation with the Personal Data Protection Office.
During the event, issues related to the obligations of data controllers in the context of designing artificial intelligence (AI) systems and adapting organisations in the field of cybersecurity to the requirements of the NIS2 Directive will be discussed. This event will also be an opportunity to present the possibilities and ways of using AI in public administration and other institutions.
- Data protection and state security - conclusions after the seminar of the Polish SA and the ZUS
On 7 October, a seminar on ‘Data protection as an element of the resilience of society and the state’ was held at the Social Insurance Institution Headquarters in Warsaw. During the meeting, the Personal Data Protection Office and the Social Insurance Institution, in cooperation with the Social Team of Experts by the President of the Personal Data Protection Office, presented trends and directions applicable to the protection of personal data as an element of the resilience of society and the state.
Topics covered at the event included new types of cyber-attacks, the impact of warfare on the number of data protection breaches, and new methods of phishing and social engineering attacks in the context of Russia's aggression against Ukraine and the conflict in the Middle East.
There is a reason why this conference took place in October, European Cybersecurity Month. This is a campaign organised by ENISA, the European Union Agency for Cybersecurity, at the initiative of the European Commission. Its aim is to popularise knowledge, raise awareness and exchange good practices in the area of cybersecurity among Internet users, professionals and those involved in the education of children and young people. This year's edition paid special attention to social engineering, that is, the manipulation techniques that cybercriminals use in online scams targeting web users.
The seminar was an opportunity to discuss new trends in the activities of hackers exploiting gaps in the security of systems that process personal data. Participants discussed the privacy challenges posed by new technologies such as artificial intelligence, machine learning and biometric data processing, as well as the use of personal data leaks for disinformation or interference with democratic processes.
- Lack of appropriate technical and organisational measures may cause problems
The President of the Personal Data Protection Office has imposed fines of PLN 15,000 and PLN 20,000 on two municipal institutions in Kutno for, among other things, failure to implement appropriate technical and organisational measures, resulting in a personal data breach: an unencrypted pendrive with the personal data of approximately 1,500 people was lost. The company servicing these institutions during the change of their HR and payroll software also received a fine of more than PLN 24,000.
All three institutions had procedures for safeguarding data, but in the course of the work to transfer the data to the new HR and payroll system of the Municipal Social Welfare Centre (MOPS) and the Municipal Sports and Recreation Centre (MOSiR), the data was not effectively safeguarded. No personal data risk analysis was carried out for the procedure of changing the HR and payroll system at MOSiR and MOPS either.
An MOPS employee, who also worked for MOSiR, shared the data with an employee of the company carrying out the transfer. The data was copied onto a pendrive which, however, was not encrypted. The company employee then copied some of the data onto the company laptop. After this operation, the pendrive was not wiped, as stipulated by that company's procedure.
An employee of the company travelled to another city and lost the pendrive there. The person who found it first placed an announcement in the local media and, as this did not yield results, then opened the drive. Based on the names of the folders, the finder guessed that it contained information concerning MOPS and MOSiR in Kutno and contacted them.
Only then did these institutions realise that the pendrive containing personal data had been lost. They notified the breach to the President of the Personal Data Protection Office. The pendrive contained the data of approximately 1,000 former and current employees and collaborators of MOSiR, and the data of 549 employees, pensioners, former employees, contractors and participants of MOPS intervention works.
The scope of the data of the two institutions was different, but in total, data such as first names, surnames, parents' first names, dates of birth, bank account numbers, residence or domicile addresses, PESEL identification numbers, e-mail addresses, data on earnings and/or possessions, mother's family names, ID card series and numbers, telephone numbers, data on holidays, sick leaves, data on completed schools, employment history, children's names and their dates of birth could be found on the carrier.
The President of the Personal Data Protection Office investigated the case and found that if a risk analysis had been carried out for the process of replacing the HR and payroll system, the personal data breach would not have occurred. In its absence, no one controlled the process and no one checked whether the procedures of the company carrying out the change of the HR and payroll system were adequate.
The obligations of those involved in the processing of personal data should not end with a two-step process, i.e.:
1. carrying out a risk analysis, and
2. implementing appropriate technical and organisational measures to ensure the security of the personal data processed.
MOPS, MOSiR and the company changing the HR and payroll system should all have verified that the personal data was shared in a way that took into account the risk of losing its carrier, and that it was adequately protected against unauthorised access (e.g. by requiring a password to open all files or folders containing personal data). If this had been done, the personal data breach could have been prevented.
- President of the Polish SA at the conference on impact of disruptive technologies on personal data
Legal challenges related to the development and implementation of disruptive technologies, i.e. breakthrough technologies for the economy, such as artificial intelligence, the internet of things, robotics, quantum technology or neurotechnology, among others, were the topics discussed during the international scientific conference ‘Legal Challenges of Disruptive Technologies’. The conference took place on 7-8 November 2024 at the Leon Kozminski Academy in Warsaw. The event was attended by Mirosław Wróblewski, President of the Personal Data Protection Office.
During his speech, entitled ‘Artificial Intelligence and the Protection of Fundamental Rights in the Context of the General Data Protection Regulation’, the President of the Personal Data Protection Office discussed various issues of the Artificial Intelligence Act and its relationship to the existing European data protection regime. He also outlined the importance of the Act in the context of ensuring and protecting fundamental rights.
‘Many public and private sector organisations implementing high-risk AI systems will be required to conduct a Fundamental Rights Impact Assessment (FRIA). The purpose of the FRIA is to enable early identification of potential threats to fundamental rights and to take appropriate measures to mitigate them,’ Miroslaw Wroblewski pointed out in the conclusion of his speech.
One of the co-organisers of the event was legal counsel Roman Bieda, a member of the Social Team of Experts by the President of the Personal Data Protection Office. Another member of the team, legal counsel Dr Dominik Lubasz, also featured at the conference as one of the speakers.
The conference was held in English and was organised in cooperation with a number of foreign universities and research institutes.
- Failure to implement the appropriate security measures for data could result in data loss
The President of the Personal Data Protection Office fined a company selling, among other things, burglar-proof doors more than PLN 350,000 for failure to comply with data protection rules. The partners of the civil partnership entrusted by the company with data processing were fined PLN 9,800.
The company notified the authority that it had lost access to customer and employee data as a result of a hacking attack. The database contained data of, among others, former and current employees: identification numbers (PESEL), ID card numbers, first and last names, parents' names, dates of birth, bank account numbers, home or residence addresses, email addresses and telephone numbers. According to the company, one of its employees disabled the anti-virus programme, which enabled the ransomware attack. According to the controller, however, the incident was short-lived and the company managed to regain access to the data. It also considered that the purpose of the attack was blackmail rather than obtaining data, and consequently that there was no high risk to the rights or freedoms of individuals. The company (data controller) communicated the breach to the data subjects. However, it did so in a flawed manner and did not respond to the comments of the Polish SA.
The President of the Polish SA comprehensively considered the evidence gathered in the case and also asked the company (data controller) what solutions it had implemented after the attack. As a result, the President of the Personal Data Protection Office found that the data controller had not implemented appropriate technical and organisational measures to mitigate the risk to the data, because, contrary to the requirements of the GDPR, it had not carried out an adequate risk analysis. In these circumstances, the risk assessment should have taken into account the possibility of malware. One of the key methods of preventing such attacks is to use up-to-date software for all elements of the IT infrastructure. The company failed to do this, as it had not identified such a threat.
Regardless of the controller's failure to implement appropriate technical and organisational security measures on the basis of a risk analysis, the fine was also imposed for: failure to verify that the processor provides sufficient guarantees to implement appropriate technical and organisational measures so that the processing meets the requirements of the GDPR and protects the rights of data subjects (point I(b) of the operative part of the decision); and incorrect communication to data subjects (point I(c) of the operative part of the decision).
The controller also failed to comply with the principle of accountability under the GDPR (Article 5(2) of Regulation 2016/679), both before and after the incident. At no stage of the processing of personal data did it precisely identify all identifiable risks or threats, which made the implemented security measures ineffective. The measures implemented after the attack were also inadequate: the controller was unable to demonstrate that they were appropriate to the risks because it had not examined those risks.
The controller indicated that a person (the human factor) was at fault but, by its own admission, it had conducted only two data protection training sessions, and only one of them before the incident. This is not enough if the controller believes that the ‘human factor’ poses a risk to data in its organisation.
The President of the Personal Data Protection Office also found misconduct on the part of the controller in notifying its former as well as current employees of a breach in the protection of their personal data.
The President of the Personal Data Protection Office also noted the liability of the partners of the civil-law partnership that the controller had entrusted with data processing. He pointed out that they had failed to assist the controller in complying with its obligation to implement adequate technical and organisational measures ensuring the security of personal data processing. Such assistance should have consisted of informing the controller of the lack of adequate security measures for the server used in the processing of personal data, irrespective of whether that lack was exploited by the perpetrators of the ransomware attack and, as in this case, led to a personal data breach. For years, the processor neglected to inform the controller about the vulnerabilities present in the server's software (one of which was successfully exploited by the perpetrators) and about the need to upgrade the operating system to the latest possible version or to use other, newer solutions.
- The President of the Personal Data Protection Office met with representatives of Microsoft
On November 15 this year, the President of the Personal Data Protection Office, Mirosław Wróblewski, and the Deputy President of the Personal Data Protection Office, Professor Agnieszka Grzelak, met, together with colleagues, with representatives of Microsoft, including Julie Brill, Chief Privacy Officer and Corporate Vice President of Global Privacy, Safety and Regulatory Affairs at Microsoft.
The meeting discussed issues related to the use of personal data to train AI models, including risks to user privacy and compliance with data protection regulations. The challenges that cloud services pose for data protection in transfers outside the EU were also discussed.
Particular attention was given to the need to ensure transparency and effective supervision of user data, including the possibility of implementing new technological solutions supporting data protection. The meeting was an important step in building cooperation between the data protection authority and representatives of the IT market, for the sustainable development of technology, while respecting the right to privacy.
- PUODO's comments on the draft law on the National Register of Marked Dogs and Cats
The aim of introducing the Register is a legitimate one: to reduce animal homelessness. However, a database in which data on up to 12 million dog and cat owners may be processed should be created in a way that limits the risk to that data.
The President of UODO has submitted his comments to the Minister of Agriculture on the draft law, which has been under consideration since 1 October this year on the portal of the Government Legislation Centre. The President of UODO was not consulted on the draft, but decided to comment on it ex officio, hoping to be involved in further work on it.
The UODO notes that the draft requires an in-depth analysis of the effects of the change in the law on the protection of personal data (the so-called privacy test). It also enumerates concerns and needed changes to the draft law.
- The qualified electronic signature certificate should not reveal the PESEL number
The President of UODO has requested the Minister of Digitalisation to amend the Act on Trust and Electronic Identification Services so that the PESEL number is not made public in the qualified electronic signature certificate.
This is another request on this issue - but the supervisory authority's demands have so far failed to produce the expected results.
The problem with the PESEL number has been signalled to the President of UODO by institutions and organisations where qualified electronic signatures are used. The PESEL number is obtained by providers of public trust services (qualified electronic signatures) and then made public, a practice required by neither European nor national legislation.
The use of qualified electronic signature certificates is regulated by the eIDAS Regulation (Regulation No 910/2014 of the European Parliament and of the Council of 23 July 2014 on electronic identification and trust services for electronic transactions in the internal market), and the Trust Services Act (Act of 5 September 2016 on trust services and electronic identification).
- Use of wearable cameras by ticket controllers and the duty to inform
Poland should join the proceedings before the Court of Justice of the EU concerning the preliminary question of how to implement the information obligation regarding the way ticket controllers process data obtained by means of body cameras. The ruling may have an impact on the application of national law.
The case in question is Case C-422/24 Storstockholms Lokaltrafik (personal data protection - right to information and access to personal data obtained by means of a body camera). The President of the Office for the Protection of Personal Data (UODO) forwarded this opinion to Minister Agnieszka Bartol-Saurel of the Chancellery of the Prime Minister.
The request described the situation of ticket controllers equipped with body cameras. The cameras were intended to prevent threats and acts of violence, as well as to assist in verifying the identity of passengers required to pay an additional charge.
The cameras used by the ticket controllers recorded video and sound. The recordings were initially erased automatically after two minutes, and later after one minute. However, controllers were required to stop the deletion of a recording if they imposed a fine on a passenger or heard threats from the passenger. In that case, the system retained the recording from one minute before the controller stopped the deletion.
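The retention mechanism at issue - continuous recording with automatic deletion after a fixed window, unless the controller intervenes, in which case footage from one window before the intervention is preserved - can be illustrated with a small sketch. This is a toy model only, not the actual Storstockholms Lokaltrafik system; the class name, the segment granularity and the use of a 60-second window are assumptions for illustration.

```python
from collections import deque

class RollingRecorder:
    """Toy model of a body-cam buffer: segments older than the
    retention window are erased unless deletion has been stopped."""

    def __init__(self, window_seconds=60):
        self.window = window_seconds
        self.segments = deque()          # (timestamp, segment) pairs
        self.retained = []               # segments preserved as evidence
        self.deletion_stopped_at = None  # time the controller intervened

    def record(self, now, segment):
        self.segments.append((now, segment))
        if self.deletion_stopped_at is None:
            # Normal operation: erase anything older than the window.
            while self.segments and now - self.segments[0][0] > self.window:
                self.segments.popleft()
        else:
            # Deletion stopped: keep everything from one window
            # before the intervention onwards.
            cutoff = self.deletion_stopped_at - self.window
            self.retained = [s for t, s in self.segments if t >= cutoff]

    def stop_deletion(self, now):
        """Controller imposes a fine or hears threats from a passenger."""
        self.deletion_stopped_at = now
        cutoff = now - self.window
        self.retained = [s for t, s in self.segments if t >= cutoff]

rec = RollingRecorder(window_seconds=60)
for t in range(0, 180, 10):          # three minutes of footage
    rec.record(t, f"seg@{t}")
rec.stop_deletion(170)               # incident at t=170
print(rec.retained)                  # segments from t=110 onwards survive
```

The data-protection point the sketch makes concrete is that, by default, nothing outlives the short window; only a deliberate intervention extends retention, and even then only back to one window before it.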
- Land registry data will be better protected
Introducing a system of authentication before entry into the land registry database is a good idea, believes Mirosław Wróblewski, President of UODO. Such a solution would block the automated extraction of personal data from land and mortgage registers and make it easier to fulfil the obligations that the GDPR imposes on data controllers.
It is good that the leadership of the Ministry of Justice has recognised the threats to the large volumes of personal data collected in land and mortgage registers, which today can be easily obtained via the Internet, writes the President of UODO, Mirosław Wróblewski, to the Deputy Minister of Justice, Arkadiusz Myrcha, after learning about the Ministry's proposals to increase the protection of that data.
Data protection in The Netherlands

- Holiday parks adjust use of facial recognition after investigation by Dutch DPA
The Dutch Data Protection Authority (Dutch DPA) has investigated 8 holiday parks that deploy facial recognition to give access to swimming pools and playgrounds. All of the holiday parks investigated turned out to have violated privacy law, for example by not pointing out to their guests that they could also visit the swimming pool without subjecting themselves to facial recognition. Under pressure from the Dutch DPA, 7 of the parks investigated have adjusted their way of working, but 1 holiday park has not yet done so. If this continues, the Dutch DPA may impose further measures, such as a fine or an order subject to periodic penalty payments.
The Dutch DPA started the investigation after receiving tip-offs from citizens. "People were surprised," says Monique Verdier, vice chair of the Dutch DPA. "Where they used to access the swimming pool using a card or a wristband, facial recognition was suddenly deployed. For adults, but also for children. Solely for gaining access to the swimming pool. Is that allowed, just like that? That is what they wanted to know."
- Facial recognition is usually prohibited
The General Data Protection Regulation (GDPR) sets strict requirements for the deployment of facial recognition. "Deployment is in principle prohibited. And that is for a reason," says Verdier. "Once such a facial scan has been made of you, you have lost control. Then you can be identified and followed everywhere. Your face is unique, and you cannot just swap it for a new face."
In principle, facial recognition is permitted in 3 cases only:
1. If the facial recognition serves only a personal or household purpose, such as unlocking a phone.
2. If the facial recognition is necessary for purposes of authentication or security. Such necessity, however, does not arise easily. It must concern a substantial public interest. For example, the security of a nuclear power plant or information that constitutes a state secret.
3. If the person whose face you scan explicitly consents to this.
- Final recommendation on supervision of AI: sector and centrally coordinated
Collaboration, coordination and the best possible use of expertise in the areas of fundamental rights and safety. Those elements are at the heart of the final recommendation 'Supervision of AI' presented by the Dutch Authority for Digital Infrastructure (RDI) and the Dutch Data Protection Authority (Dutch DPA) today. The recommendation describes how the use of AI can effectively be supervised through an integrated approach.
Artificial intelligence (AI) is developing at breakneck speed and is used on an increasingly large scale everywhere. The possibilities for application are endless, and AI offers huge opportunities for our society. At the same time, there may be substantial risks.
The AI Act contains rules for responsible development and use of AI by businesses, governments, and other organisations. Well-organised supervision gives consumers confidence and creates clarity for organisations and the business sector. They can continue to consult the (sectoral) supervisory authorities they already know.
- Netflix fined for not properly informing customers
Netflix did not give customers sufficient information about what the company does with their personal data between 2018 and 2020. And the information that Netflix did give was unclear on some points. For this reason, the Dutch Data Protection Authority (Dutch DPA) is imposing a fine of 4.75 million euro on the streaming service. Netflix has since updated its privacy statement and improved its information provision.
Netflix collects various types of personal data of customers, ranging from email addresses, telephone numbers and payment details to data about what customers watch on the platform, and when exactly.
An investigation started by the Dutch DPA in 2019 shows that Netflix did not inform customers clearly enough in its privacy statement about what exactly Netflix does with those data. Furthermore, customers did not receive sufficient information when they asked Netflix which data the company collects about them. These are violations of the General Data Protection Regulation (GDPR).
‘Must be crystal clear’
‘A company like that, with a turnover of billions and millions of customers worldwide, has to explain properly to its customers how it handles their personal data,’ Dutch DPA chairman Aleid Wolfsen says. ‘That must be crystal clear. Especially if the customer asks about this. And that was not in order.’
Too little and unclear
On several points, Netflix provided too little information to customers, or the information provided was unclear. The company was not clear enough about:
• the purposes of and the legal basis for collecting and using personal data;
• which personal data are shared by Netflix with other parties, and why precisely this is done;
• how long Netflix retains the data;
• how Netflix ensures that personal data remain safe when the company transmits them to countries outside Europe.
- Complaints from an Austrian privacy foundation
The Dutch DPA started this investigation following complaints from None of your business (noyb), an Austrian NGO that is committed to privacy. Those complaints were submitted to the Austrian data protection authority and forwarded to the Dutch DPA, because Netflix has its main European establishment in the Netherlands.
Under the GDPR, companies that process data in several EU Member States have to deal with only one data protection authority: the authority in the country in which the company has its main establishment. The Dutch DPA has coordinated the investigation and the amount of the fine with other European data protection authorities.
Data protection in Spain

- The AEPD publishes an analysis on the protection of children and adolescents in the digital environment
The Spanish Data Protection Agency (AEPD) has published 'Safe Internet by default for children and the role of age verification', which analyses how children and adolescents can be protected on the Internet without this entailing surveillance and invasion of the privacy of all users, and without exposing children to being located or to new risks. The analysis focuses on the obligation to comply with the data protection principles of the General Data Protection Regulation (GDPR), together with other regulations that complement or deepen the protection of minors.
The document presents different strategies for protecting children and adolescents on the Internet, defining several use cases: protection against inappropriate content, safe environments for children, consent to the processing of personal data, and child-friendly design. Each use case is subject to different regulatory frameworks and, as a common framework, to the GDPR as regards the processing of personal data.
The published analysis explains that, at present, a large proportion of Internet services rely, at best, on strategies that react only once damage or harm has already been detected. A variant is to enable Internet service providers to know who is a minor, for example by creating specific spaces or accounts for children and adolescents. These strategies, it adds, require intrusive intervention in the form of surveillance or profiling that violates the privacy of all users: they allow a minor to be located and made easily accessible to any malicious actor; they legitimise the processing of additional personal data of children and adolescents; they steer minors towards decisions that are not theirs to make; and they conceal profiling for purposes related to deceptive or addictive patterns, loyalty, contracting, consumption or the monetisation of personal data.
The Agency collects examples and good practices for protecting minors from the risks related to access to adult content, such as contact with people who may put them in danger, the contracting of products and services, the monetisation of their personal data, the inducement of addictive behaviours that affect their physical or mental integrity, and other harms.
- The age verification system proposed by the AEPD receives two awards at the Global Privacy Assembly
The age verification system proposed by the Spanish Data Protection Agency (AEPD) to protect children and adolescents on the internet has been recognised with two of the five awards of the 46th Global Privacy Assembly, which brings together data protection and privacy authorities globally.
The Agency has received the awards in the Innovation and Public Award categories, the latter being selected from among the shortlisted candidates in each of the categories. These awards aim to recognise the excellence and innovation of the good practices put in place by the Authorities.
In December 2023, the Spanish Data Protection Agency presented a proposal for a system for verifying age and protecting minors on the internet from access to adult content, demonstrating that it is technically possible to protect minors from inappropriate content while ensuring the anonymity of adults browsing the internet. The system consists of a decalogue setting out the principles an age verification system must comply with, a technical note with the details of the project, and three proofs of concept executed on Android, iPhone and Windows. This is complemented by a graphic showing the risks of the age verification systems currently in use.
The Global Privacy Assembly has assessed the innovative nature of this proposal, which has had a significant impact at international level, giving it, in addition to the Innovation Award, the Public Award. The objective of this development of the Agency is to demonstrate that a guarantee system can be built that only confirms the attribute of majority, demonstrating that there is no need to establish dedicated parallel digital identity services for access to adult content and thus separate the identity of individuals from age verification.
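The design goal described above - a system that confirms only the attribute of majority, without revealing identity - can be illustrated with a deliberately simplified token scheme. This is not the AEPD's actual proposal (which relies on dedicated proofs of concept); the function names are invented, and a real deployment would use public-key signatures or zero-knowledge proofs rather than a shared symmetric key.

```python
import hashlib
import hmac
import secrets

ISSUER_KEY = secrets.token_bytes(32)  # held by a trusted age-attestation issuer

def issue_adult_token():
    """The issuer verifies age out of band, then signs only the
    attribute 'adult' plus a random nonce: no name, no birth date."""
    nonce = secrets.token_hex(16)
    tag = hmac.new(ISSUER_KEY, f"adult:{nonce}".encode(), hashlib.sha256).hexdigest()
    return nonce, tag

def site_verifies(nonce, tag):
    """The adult-content site learns exactly one bit: the bearer
    holds a valid attestation of majority."""
    expected = hmac.new(ISSUER_KEY, f"adult:{nonce}".encode(), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, tag)

nonce, tag = issue_adult_token()
print(site_verifies(nonce, tag))   # True, and the site learned nothing else
```

The point the sketch tries to capture is the separation the AEPD proposal argues for: the party that knows who you are never learns what you visit, and the party you visit never learns who you are.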
The two awards granted to the Agency this year by its counterpart authorities around the world add to the 31 awards that the AEPD has received in recent years for its initiatives to protect people in a digital world.
The Global Privacy Assembly is a global annual forum where independent privacy, data protection and freedom of information supervisory authorities adopt high-level resolutions and recommendations addressed to governments and international organisations. Similarly, its vision is to maintain an environment in which privacy and data protection authorities around the world can act effectively to fulfil their mandates, both individually and in a concerted manner, by disseminating knowledge and supporting connections.
In the previous edition of these prizes (2023), the Agency was recognised with the Global Privacy Assembly’s Priority Channel Prize ‘Conflict Resolution – Law Enforcement’, recognising its value as an effective tool for the swift removal of content published online in situations where the physical and psychological integrity of the individuals concerned is seriously jeopardised.
- AEPD and Atresmedia Foundation launch the ‘No to digital free bar’ campaign to alert about the risks of early mobile access
The Spanish Data Protection Agency (AEPD) and the Atresmedia Foundation join forces to warn about the dangers of minors accessing inappropriate content via mobile phones and to promote support in the use of technology in children and adolescents.
To this end, the two entities have launched the 'No to the digital free bar' campaign, which recommends that families delay giving their children a mobile phone and accompany them in their interaction with the digital world, thus preventing them from accessing content that is inappropriate and detrimental to their development, such as pornographic or violent content.
In this context, the CEO of Atresmedia and trustee of the Atresmedia Foundation, Javier Bardají, and the Director of the AEPD, Mar España, stressed the importance of adult support in the use of technology, as well as the risks that social media can pose to the mental health of children and young people, agreeing on the need to equip minors with the skills needed to meet these digital challenges and recalling that a safe digital environment is everyone's responsibility.
“All families, regardless of socio-economic or cultural factors, have experienced or will experience what it means to give a mobile device to their children, and there is a need to address the relationship between technology, children, hyperconnection and exposure to inappropriate content. It is difficult for a child to self-regulate, because he or she does not have sufficient tools to manage it, just as they would not be able to manage the situations depicted in this campaign. For this reason, we consider it essential to delay handing over the mobile phone and, when the time comes, to have dealt with the issue in the family, since dialogue and support are essential,” said Mar España, Director of the Spanish Data Protection Agency.
“Big tech companies designed social networks optimised from the outset to hold our attention for as long as possible, at the expense of our mental health and well-being and, even more seriously, at the expense of the mental health of our children and young people. It is imperative to point out and denounce the responsibility of big tech companies as the main broadcasters of fake news, polarisation, violence, pornography, gambling, and so on. In the Atresmedia Group we have a clear commitment to this cause; we will always be part of the solution and not the problem. We are a trusted medium,” said Javier Bardají, CEO of the Atresmedia Group and trustee of the Atresmedia Foundation.
Both bodies, committed to the protection of children and adolescents and to promoting a safe digital environment, point out that, although young people are very adept at handling applications, searching for content and navigating, they are not always aware of the risks these actions involve. Family supervision and guidance are essential, especially when children are given their own mobile phone at an early age.
'No to the digital free bar' reinforces the idea that allowing children early and unrestricted access to screens opens the door to risks that may affect their well-being. The spot visualises this problem through striking images: a boy alone in an American-style bar and a girl alone in a betting shop, accompanied by a clear and direct message: “If you wouldn't leave your child alone in a place like this..., don't leave them alone in the digital world. Delay giving your children a mobile phone and accompany them in its use.”
According to data shared by Arturo Béjar, former head of protection at Meta, in the laSexta 'Salvados' special 'Social media, the terror factory', 1 in 8 children receives unwanted sexual advances on Instagram every 7 days; 1 in 5 feels worse about themselves after using the platform; and 1 in 10 experiences bullying on the platform every week. These data underline the importance of supervised and safe use of technology during childhood and adolescence, a central focus of the Atresmedia Foundation and AEPD initiative.
The campaign, launched on Thursday under the hashtag #Noalabarralibredigital, will be broadcast on all television channels of the Atresmedia Group: Antena 3, Nova, Neox, Mega and Atreseries, and on their media, as well as on the websites and social media of the AEPD and the Atresmedia Foundation.
- ‘There are more risks on the Internet than in real life’, a new campaign by the Agency and the General Council of Psychology
The Spanish Data Protection Agency (AEPD) and the Spanish General Council of Psychology (COP) launched their campaign ‘There are more risks on the internet than in real life’, aimed at enabling families to assess the consequences of giving their children a device with access to all kinds of internet services. The campaign is supported by Atresmedia, Mediaset España and RTVE, who, through their participation, strengthen their commitment to the rights of children and adolescents in the digital environment and will disseminate it through their respective channels.
More than 90% of first-year compulsory secondary education (ESO) students have their own mobile phone with an Internet connection. In parallel, more than 80% of parents are concerned about the time children and adolescents spend with these devices, according to the CIS.
With the campaign 'There are more risks on the internet than in real life', the AEPD and the COP want to invite families to reflect on what it really means to give a smartphone to their children, equating the effects of certain internet services with the dependence and addiction that some substances generate.
The early and intensive use of digital media affects the physical, psychological and social health of children and adolescents, and handing over a device opens the door to situations that seriously damage both their privacy and, in extreme cases, their mental health: receiving and sending compromising photos, cyberbullying, contact with adults posing as children, hyperconnectivity, and so on. As children's personalities are still developing, intensive use could have consequences for their neurodevelopment. Families have often not received information and are not fully aware of the effects that inappropriate, problematic or addictive use of certain internet services has on children and adolescents, seriously affecting their personal development, and in particular their health (physical, mental, psychological, social and sexual); their neurodevelopment; their learning; their family and social relationships; their consumption habits; and the monetisation of their data.
In addition, over-exposure of personal information makes them more prone to risky situations such as cyberbullying, sexting or grooming, with consequences that are difficult to repair in some cases.
In this regard, the AEPD report 'Addictive patterns in the processing of personal data' focuses on platforms, applications and services whose business model, in many cases, relies on prolonging the time users spend on the platform and increasing their level of engagement and the amount of personal data collected. In general, all consumers of digital products are potential victims of addictive design, as the techniques used to make people spend longer than is recommended or healthy are increasingly sophisticated. The ultimate aim is to retain attention for as long as possible, without regard to the possible harm to users or their family environment, even changing their will or behaviour, with negative and sometimes irreversible consequences for their physical or psychological integrity.
The launch of the Agency’s campaign ‘There are more risks on the Internet than in real life’ together with the General Council of Psychology in Spain deepens its comprehensive strategy on children, digital health and privacy, which sets out this public body’s priority lines of action to promote the protection of children and adolescents in their use of the Internet and its services.
Data protection in Mexico

- Access to information is the basis of freedoms and rights.
Access to information is the cornerstone of freedoms, human rights and democratic participation, said Blanca Lilia Ibarra Cadena, Commissioner of the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI), during the commemoration of International Universal Access to Information Day, organised by UNESCO and held in Ghana. Access to information helps build transparent, accountable and inclusive societies, promotes social justice, facilitates the exercise of human rights such as the right to health and education, and strengthens public institutions, said Ibarra Cadena, speaking remotely as a representative of the International Conference of Information Commissioners (ICIC), which INAI chairs.
- Artificial intelligence and personal data, INAI's presentation on the third day of the FIL Monterrey
INAI and the National Transparency System (SNT) presented the new issue of the journal Society and Transparency, dedicated to artificial intelligence and the challenges it represents for the protection of personal data, on the third day of activities at the Monterrey International Book Fair.
- 10 tips to protect your phone's personal data
Your privacy could be at risk if someone accesses the information on your cell phone without your consent. Your personal data could be exposed if it is stolen, lost, or even if, through carelessness or ignorance, the settings on your device allow certain applications to access your videos, photos, or passwords without control.
The National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) gives you these 10 tips to protect the information and personal data stored on your cell phone.
1. Set a lock password to prevent someone from using it without your consent and reviewing the information it contains.
2. Use strong passwords: use uppercase, lowercase and numbers.
3. Download applications only from official sites or stores.
4. Check the automatic upload and backup permissions of cloud services.
5. Delete your browsing history frequently.
6. Check the applications that use geolocation to decide which ones you allow to know your geographic location and when.
7. Make backup copies of the information stored on your cell phone.
8. Read the privacy notices of the applications before downloading them to know how they will treat your personal data and what information on your phone you give them access to.
9. Do not exchange private or confidential information when you are connected to a free Wi-Fi network.
10. Avoid making online transactions or purchases when you are connected to a free Wi-Fi network.
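Tip 2's recommendation to mix uppercase, lowercase and numbers can be checked programmatically. The sketch below is illustrative only; the function name and the minimum-length threshold are assumptions added here, not part of INAI's list.

```python
def meets_inai_tip2(password: str, min_length: int = 8) -> bool:
    """Check the character mix recommended in tip 2: uppercase,
    lowercase and digits (the minimum length is an extra,
    assumed safeguard, not stated in the original tip)."""
    return (
        len(password) >= min_length
        and any(c.isupper() for c in password)
        and any(c.islower() for c in password)
        and any(c.isdigit() for c in password)
    )

print(meets_inai_tip2("Xk7mPq2w"))   # True
print(meets_inai_tip2("password"))   # False: no uppercase, no digits
```

Many password managers and sites apply the same kind of composition check before accepting a new password.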
- INAI publishes 7 interpretation criteria that promote human rights of access to information and the protection of personal data
The National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) published today in the Official Gazette of the Federation (DOF) seven criteria for the interpretation of the General Law on Transparency and Access to Public Information and the General Law on the Protection of Personal Data Held by Obligated Subjects, corresponding to the third period of criteria.
These criteria for interpretation reflect the work of the members of the INAI Plenary, which, through its resolutions, privileges the principles of universality, interdependence, indivisibility and progressiveness that support the human rights of access to information and the protection of personal data.
- 10 things you shouldn't post on social media
What you share on social media can reveal more than you think. A picture of where you work, your birthday celebration, your new car or tickets for your next holiday can be the pieces that allow a criminal to put together a strategy to steal your identity, defraud you, extort you or burgle your home.
The National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) recommends that you avoid posting the following on social networks:
1. Personal information that makes you identifiable. Publishing your full name, telephone number, place of work or address exposes you to identity theft.
2. Documents. When you post a birth certificate, travel tickets or your passport, you show data that someone can use to commit crimes.
3. Financial information. Sharing your bank card details or account numbers opens the door for criminals to commit theft or fraud.
4. Purchases of expensive items. Showing off valuable items such as a new car or house can reveal your financial standing and expose you to theft.
5. Travel plans. If you give notice of the time you will be away from home, someone could take advantage of this to commit crimes such as burglary while you're away.
6. Location. Sharing where you are in real time can create a pattern of habits and put you at risk.
7. Photographs of underage people. Your followers can take and share the images, putting children and adolescents at risk.
8. Complaints about your work. The Internet does not forget. Information you post about your workplace could affect your working relationships or cause complications in getting a new job.
9. Intimate details. Someone may pick up on your postings about relationship disputes or illnesses and cause more problems for you.
10. Personal information about family or friends. Showing details such as names or location of other people affects their privacy and exposes them to risk.
- Protect your privacy on Instagram and have fun wisely
Protecting your privacy on Instagram is important to reduce the risk of being a victim of harassment, fraud or identity theft. Instagram users often share details of their daily lives, such as their location, activities, photos and videos, which can be used by malicious individuals to commit crimes.
- Position of the INAI Plenary regarding the discussion in the Chamber of Deputies of the constitutional reform initiative that proposes the disappearance of this Institute
Position of the Plenary of the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) issued in today's ordinary session regarding the constitutional reform initiative on organic simplification that is being discussed in the Plenary of the Chamber of Deputies.
The next few hours will be crucial for the future of transparency, access to information and personal data protection in Mexico. The legislators of the LXVI Legislature have in their hands a historic decision and responsibility: the possible disappearance of INAI, along with other autonomous bodies, as part of the organic simplification ruling that will be discussed and voted on, according to the agenda, between today and tomorrow.
- Questions and answers about privacy for older adults
Older adults attending INAI's Second International Congress on Personal Data Protection and Privacy shared their concerns about how to surf the internet safely and how to protect their personal data when shopping online, banking or sharing messages or photos with friends or family.
Ileana Gama Benítez, Director of Information and Accessibility of the Federal Institute of Telecommunications (IFT), and Miriam Padilla Espinosa, Director of Private Sector Security at the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI), answered some of the most recurrent questions asked by older adults:
• How secure are bank apps? If it is the bank's official app, it is a secure system, but it is recommended to avoid using it when connected to public internet networks.
• What do I do if I have been a victim of fraud? Specialists suggest reporting it to the courts. You can also go to the INAI if there was a risk of personal data theft.
• Is it safe to use public WiFi networks such as those offered by hotels? They are safe for ‘quick uses’, such as consulting Google, sending a message or checking a route to a specific address. Sharing personal information is not recommended.
- Going on vacation? Protect your personal data
If you are already packing your bags to enjoy the winter holidays, the National Institute for Transparency, Access to Information and Protection of Personal Data (INAI) guides you on how to take care of your personal information.
Protecting your personal data prevents you from falling victim to cybercriminals who can damage your assets, impersonate you or commit extortion. To avoid situations that compromise your peace of mind and security due to the improper use of your personal information, INAI recommends the following:
• Avoid posting information about your travel plans, photos, videos and/or real-time location on social media. Such information is evidence that you are not at home or at work and can be used to affect your assets.
• If parcels arrive at your home in your absence, ask someone you trust to receive them. Remember that they carry visible personal data.
• Set up security measures on your phone. Enable security settings such as password lock, remote wipe, encryption, data backup and 2-step verification.
• Travel with only the necessary documents. Don't carry all your IDs and bank cards - pack only the ones you know you'll need.
• Be wary of public Wi-Fi networks. Whoever administers them can monitor sensitive, private or confidential information transmitted over them.
• Avoid using online banking when connected to public networks.
• If you are using someone else's computer, such as in an internet café or hotel, use the incognito browsing option to prevent your personal and internet usage data from being stored.
• Read privacy notices to understand how the personal data you provide will be used and what it will be used for.
Data protection in India

- India restricts WhatsApp sharing data with other Meta entities, imposes $25.4 mln fine
India's competition watchdog on Monday directed WhatsApp to refrain from sharing user data for advertising purposes with other applications owned by Meta (META.O) for a period of five years and fined the U.S. tech giant $25.4 million over antitrust violations related to the messaging application's 2021 privacy policy.
The Competition Commission of India (CCI) launched a probe in March 2021 into WhatsApp's privacy policy, which allowed data sharing with Facebook and its units, sparking global backlash.
"Sharing of user data collected on WhatsApp with other Meta companies... for purposes other than for providing WhatsApp service shall not be made a condition for users to access WhatsApp Service in India," the CCI said.
Meta did not immediately respond to Reuters' request for comment.
The Indian government is currently examining a February report from a panel established by the corporate affairs ministry. The report proposed a new "Digital Competition Bill" to complement existing antitrust laws.
The U.S.-India Business Council, a key U.S. lobby group, has already opposed the move, fearing its business impact.
($1 = 84.3740 Indian rupees)
- Cross-Border Data Transfers: Best Practices under India’s Data Protection Laws
India passed the Digital Personal Data Protection Act, 2023 (“DPDPA”) – the nation’s first dedicated data protection statute – on August 11, 2023. The DPDPA undeniably marks a significant advance in aligning India’s domestic laws with international standards for data protection and privacy and is intended to replace the existing Indian Information Technology (Reasonable Security Practices and Procedures and Sensitive Personal Data or Information) Rules, 2011 (“SPDI Rules”).
Notably, the DPDPA sets out a comprehensive legal framework for the protection of personal data of individuals residing in India, akin to the European Union (EU) General Data Protection Regulation (GDPR). Like the GDPR, the DPDPA has extraterritorial application – it also regulates the processing of personal data (of individuals resident in India) by a person located outside Indian territory, provided such processing is carried out in connection with the offering of goods or services to Indian residents. Accordingly, any person engaged in processing personal data of Indian residents (in digital form) is required to comply with the provisions of the DPDPA – irrespective of whether or not such person is located in India.
While the DPDPA parallels the GDPR in certain respects, it has its own distinct structure and requirements – including, notably, requirements relating to the regulation of cross-border transfers of personal data. For Indian as well as foreign business organizations (whether targeting Indian consumers and/or engaging in cross-border trade with Indian business partners) that are now subject to regulation under the DPDPA, it is therefore essential to understand the scope and application of the requirements governing the transfer and processing of personal data of Indian residents, and to implement the steps necessary to ensure compliance.
Moreover, it is also relevant for stakeholders to be aware of the statutory requirements ahead of the imminent release of the draft Rules formulated under the DPDPA by the Indian Government (for the purpose of implementing the statute) – particularly since it is speculated that the draft Rules will likely provide only a short transition period (of around 6-8 months) for statutory compliance.
Scope and Application of the DPDPA
The DPDPA regulates the processing of digital personal data of “Data Principals” (i.e. the individuals to whom the data relates). Simply put, the statute applies to the processing of any personal data in digital form – whether collected in digital form or collected in non-digitized format and subsequently digitized.
In this regard, the DPDPA adopts a broad statutory definition of the term “personal data” – classifying it as any data about an individual “who is identifiable by or in relation to such data”. Further, where the individual to whom personal data relates is a child (i.e. any individual under 18 years of age), the term Data Principal includes the parents or lawful guardian of such child under the statute. The statute further stipulates that, insofar as the personal data relates to any person with a “disability”, the term Data Principal also includes his/her lawful guardian.
Further, the statute is also extra-territorial in application – as mentioned above, it applies to the processing of digital personal data outside Indian territory, provided such processing is carried out in connection with offering goods or services to Data Principals located in India. It thereby applies to any business organization engaged in processing the digital personal data of Indian consumers for commercial purposes, irrespective of whether the relevant business organization is located within India.
Key Stakeholders under the DPDPA
For the purpose of regulation, the DPDPA principally recognizes and demarcates stakeholders as “Data Fiduciary” and “Data Processor”. Under the statute, a “Data Fiduciary” comprises any person who, alone or in conjunction with other persons, determines the purpose and means of the processing of personal data. Meanwhile, the term “Data Processor” broadly encompasses any person who processes personal data on behalf of a Data Fiduciary.
The above designations employed by the statute are arguably akin (but not identical) to the designations of Data Controller and Data Processor under the EU's GDPR. It is important for international stakeholders, in particular, to note that these terms do not carry equivalent connotations in respect of statutory obligations.
Illustratively, the DPDPA places the primary liability for the management, security and processing of personal data (and the protection of the rights and interests of Data Principals) on the Data Fiduciary. The Data Fiduciary is held responsible for the Data Processor(s) engaged by it, including for ensuring appropriate conduct and statutory compliance on the part of such Data Processor(s). It is anticipated, however, that the forthcoming Rules under the DPDPA could further elaborate upon the obligations and duties applicable to Data Processors.
It is worth noting also that the DPDPA empowers the Indian Government to classify a certain Data Fiduciary or a certain class of Data Fiduciary as a “Significant Data Fiduciary”. The statute (non-exhaustively) stipulates that such classification may be based on factors such as the volume and sensitivity of the data processed by the Data Fiduciary, the risk of harm to the Data Principal and potential impact on the sovereignty and integrity of India and its security. The statute also prescribes enhanced compliance obligations for Significant Data Fiduciaries, which include the mandatory appointment of a local Data Protection Officer in India, engagement of an independent Data Auditor and the conduct of periodic Data Protection Impact Assessments. Further information on such measures and classification of the Significant Data Fiduciaries is also anticipated to be provided by the Indian Government under the forthcoming Rules.
Regulation of Cross Border Data Transfer
The DPDPA introduces certain important provisions in relation to cross-border processing and transfer of personal data, which are elaborated below.
• Restrictions on Transfer of Personal Data
The DPDPA empowers the Indian Government to restrict the transfer of personal data by a Data Fiduciary to certain foreign countries or territories (as may be notified). The Indian Government can thereby exercise this statutory power to blacklist a foreign territory or country prospectively and prohibit stakeholders from transferring any personal data to it.
In this regard, it is worth noting that the DPDPA does not set out the criteria on the basis of which such a restriction may be imposed by the Indian Government on a particular jurisdiction. However, it is speculated that clarity on this aspect may be incorporated in the draft Rules (to be issued under the DPDPA).
• Statutory Exemptions
The DPDPA exempts certain instances of data processing and transfer from any prohibition imposed by the Indian Government in exercise of this statutory power. The exemption extends to instances where data processing is necessary for the following purposes:
1. for enforcing any legal right or claim;
2. for the discharge of functions of any competent court, or any judicial or quasi-judicial authority, in India;
3. for prevention, detection, investigation or prosecution of any offence or contravention of any law in force in India;
4. where personal data of Data Principals (located outside India) is processed pursuant to a contract between an Indian entity and any foreign (offshore) entity;
5. for carrying out legally approved acquisition, merger or amalgamation or similar arrangement between two or more companies;
6. for ascertaining the financial information and assets and liabilities of any person who has defaulted in payment of a loan or advance taken from a financial institution.
• Concurrent Application of Additional Laws
The DPDPA clarifies that its provisions will not impact existing law in India which provides for “…a higher degree of protection for or restriction on transfer of personal data by a Data Fiduciary outside India…” than the threshold of protection established under the DPDPA.
This indicates that, when transferring personal data outside India, multinational and other organizations will also be required to comply with any applicable Indian laws that provide a higher degree of protection or restriction than the DPDPA itself.
Conclusion: Key Ramifications & Best Practices for Businesses
• Key Ramifications
The introduction of the DPDPA carries significant implications for domestic as well as international businesses engaged in trade in India, given its extraterritorial nature. With the enactment of the statute, a diverse set of stakeholders – including multinational/international corporations or service providers with or without a corporate presence in India, particularly in the e-commerce and IT industries – is now required to comply with it.
From a practical perspective, stakeholders under the statute – including international or domestic businesses operating in India – may qualify as a Data Fiduciary, a Data Processor, or both. While businesses qualifying simply as a Data Processor will admittedly bear a relatively lighter compliance burden under the DPDPA, they can still expect to deal with contractual obligations and negotiations regarding statutorily prescribed practices and procedures as part of their business arrangements and dealings with Indian stakeholders.
Further, while the DPDPA does not impose specific restrictions or requirements on the transfer of data overseas, it – unlike the SPDI Rules – provides for the prohibition of transfers of personal data to certain foreign jurisdictions or territories, as may be “blacklisted” by the Indian Government. This aspect of the DPDPA carries significant implications for businesses reliant upon outsourcing or overseas operations, or otherwise operating in industries where data processing is integral to the offering of goods and services. Such businesses may face significant challenges in conducting business with Indian customers should the foreign country or territory where important affiliates or partners are located be blacklisted by the Indian Government under the DPDPA.
• Considerations & Best Practices for Businesses
If a territory or country is blacklisted by the Indian Government, it is implicit that any collection or processing of data by relevant affiliates or partners in such territory or country will also be restricted. For adequately safeguarding business interests, it thus becomes necessary for relevant stakeholders to seek advisory to align their practices with the procedures prescribed in the DPDPA as well as to understand the recourse available to them under the statute.
Illustratively, it can be inferred that the statutory exemptions set out in the DPDPA are largely intended to facilitate the discharge of official functions by law enforcement, banking and judicial authorities in India. For multinational and other business organizations, however, it is relevant to note that the statute exempts the processing of personal data where such processing is carried out as part of a merger or amalgamation or similar arrangement between two or more corporate entities. Further, it exempts the processing of data subject to a contract between a domestic party (located in India) and a foreign party.
Consequently, relevant stakeholders – in particular businesses involved in outsourcing services or goods to or from India or business groups having or seeking control or ownership of entities in India – can employ the aforementioned two exemptions as grounds to transfer personal data of Data Principals to a foreign jurisdiction which has been blacklisted by the Indian Government prospectively (in exercise of its powers under the DPDPA) – provided that data transfers are otherwise conducted in alignment with the requirements under the DPDPA.
To provide context, relevant considerations and requirements for stakeholders under the DPDPA include: the collection of informed consent from Data Principals, including for the transfer of their personal data; the management of such consent (including accounting for withdrawal of consent); and the adoption of adequate protocols and contractual arrangements with third parties for maintaining the confidentiality and security of data and/or handling requests from Data Principals for the retention, erasure or correction of data.
In addition to the DPDPA, stakeholders must be prepared also for additional compliance under applicable laws and sector-specific regulations in India which prescribe a higher threshold of protection for the transfer and protection of personal data in India. For reference, these include relevant regulations of the Reserve Bank of India (RBI), Telecom Regulatory Authority of India (TRAI), Securities and Exchange Board of India (SEBI) and the Insurance Regulatory and Development Authority (IRDAI), per which requirements are set out for applicable stakeholders in relation to localization of storage of certain kinds of data and records. Interestingly, the SPDI Rules are also included within the scope of applicable laws at present – and will remain applicable until the time the DPDPA is fully implemented in India.
- Key Updates on India's Digital Personal Data Protection (DPDP) Act in November 2024
In November 2024, the Indian government provided important updates on the Digital Personal Data Protection (DPDP) Act, 2023, which is set to transform data privacy practices across the country. The DPDP Act, passed in August 2023, aims to safeguard personal data and ensure that its processing is both transparent and responsible.
Legitimate Purpose(s) to Collect Data & Penalty
The act's enforcement is being phased in, with some provisions expected to be fully implemented in 2024. It mandates organizations to ensure that personal data is collected for specific, legitimate purposes and that individuals' consent is obtained unless a legitimate use case is specified. Additionally, it lays out strict guidelines for data minimization, retention, and security, with penalties for non-compliance that can reach up to ₹500 crore for serious breaches.
Child Data Protection
The DPDP Act also establishes a framework for cross-border data transfers and emphasizes the protection of children's data, requiring parental consent for data processing of individuals under 18. Furthermore, it defines the rights of "Data Principals" (individuals whose data is being processed), granting them the ability to access, correct, and erase their personal data.
Data Fiduciaries
Businesses in the role of data fiduciaries will have to ensure that data is lawfully and securely processed while proactively following appropriate guidelines. Large-scale companies in a data-fiduciary role will have to meet additional compliance requirements.
With its broad scope, the DPDP Act is expected to significantly impact both businesses and consumers, as companies must adapt to stricter data protection practices to ensure compliance.
Data protection in Brazil

- ANPD adopts internal personal data protection policy
The Internal Policy for the Protection of Personal Data of the National Data Protection Authority (ANPD) has come into force. The document establishes guidelines and rules applicable to all employees during processing operations, with the objectives of ensuring and reinforcing compliance with the legislation, promoting transparency and accountability, and encouraging the adoption of good practices.
The text emphasizes that the Authority may only process personal data in accordance with Articles 7 and 11 of the General Law on the Protection of Personal Data (LGPD), which deal with legal bases and sensitive personal data, respectively. It also establishes that only the information strictly necessary to fulfil the specific purposes will be processed, that the data will be stored securely, and that it will be deleted after processing.
The rules extend to contracts, agreements and other instruments involving third parties. Therefore, these mechanisms must contain specific clauses that guarantee the protection of personal data.
The policy also covers the ANPD's Data Protection Officer, listing their duties and responsibilities, as well as their immediate superiors, who, within the scope of their functions, must incorporate good data protection practices into their routines, raise awareness among their teams, ensure the protection of the data they process, and maintain dialogue with the officer regarding any security incident.
- ANPD updates and includes new entries in the Glossary
The National Data Protection Authority (ANPD) has completed updating 40 more entries in its glossary, an essential tool for understanding and disseminating concepts related to the protection of personal data in Brazil.
With the approval of the Security Incident Reporting Regulation, the Regulation on the role of the person responsible for processing personal data, and the Regulation on the International Transfer of Personal Data and the content of the standard contractual clauses, the need arose to include new entries in the ANPD Glossary.
The review was conducted by the General Coordination of Standardization (CGN), which ensured the accuracy and clarity of the terms. With the review completed, the glossary is now ready and available to the public on the Authority's website.
This initiative aims to promote greater understanding of the principles and standards established by the General Data Protection Law (LGPD), in addition to facilitating consultation for professionals and citizens interested in the topic.
- ANPD opens sanctioning process and issues determinations to TikTok
The National Data Protection Authority (ANPD) ordered TikTok to implement regularization actions and instituted administrative sanctioning proceedings to investigate potential irregular practices of processing personal data of children and adolescents.
The decision is the result of an inspection process that began in 2021, which is moving into a new stage with the measures announced today.
In the technical area's analysis, evidence of violations of the General Law on the Protection of Personal Data (LGPD) was identified, especially with regard to the principle of the best interests of children and adolescents. According to this principle, the rights of children and adolescents must be observed as a priority, prevailing over other interests, in order to guarantee adequate protection of the personal data of these data subjects.
Furthermore, evidence of irregularities was found regarding the weakness of age verification mechanisms, combined with irregular data processing, which may constitute non-compliance with article 14 of the LGPD, which establishes guidelines for the protection of the rights of children and adolescents.
- ANPD and the Canadian Privacy Commissioner sign Memorandum of Understanding
Brazil’s National Data Protection Authority (ANPD) and the Canadian Privacy Commissioner (OPC) signed a Memorandum of Understanding (MOU) last Friday (1/11) during an official engagement in Jersey, aimed at cooperation and mutual assistance in the field of personal data protection. The agreement aims to strengthen institutional relations between the two nations and encourage technical, regulatory and oversight cooperation in this strategic area.
The measure aims to improve international collaboration on data protection, respecting the legal limits and principles of the Brazilian General Data Protection Law (LGPD). The Memorandum is not binding and does not provide for the transfer of financial resources, being of a political nature and guided by the promotion of the public interest.
- ANPD opens Call for Subsidies on AI
The Subsidy Collection process for the Artificial Intelligence and Data Protection regulatory project begins this Wednesday (6). The consultation, initiated by the General Coordination of Standardization (CGN), will be open for 30 days and aims to guide and enrich the regulation through specialized contributions from different segments of society, including experts on the subject.
The Call for Subsidies poses 15 (fifteen) questions organized into four blocks: LGPD Principles; Legal Bases; Data Subjects' Rights; and Good Practices and Governance. The objective of these questions, which are eminently technical and organizational in content, is to identify the practices in use and whether they are aligned with the guidelines established by the General Data Protection Law (LGPD) and the ANPD regulations. This allows the CGN and others involved in the project to steer regulatory activity so as to balance free enterprise and technological development with the protection of data subjects' rights.
- ANPD publishes English versions of data protection documents
The ANPD is advancing its commitment to transparency and international accessibility by making English translations of key personal data protection documents available. The texts of the General Data Protection Law (Law No. 13,709, of August 14, 2018) and the Regulation on International Data Transfers, as well as the content of the standard contractual clauses (Resolution CD/ANPD No. 19, of August 23, 2024), are now available on the Documents and Publications page of the ANPD website and can be consulted worldwide, facilitating the understanding and adoption of Brazilian standards in different jurisdictions.
In translation, these documents allow, for example, processing agents from different jurisdictions to directly use the Standard Contractual Clauses in their Portuguese-English version, providing greater legal certainty and clarity in complying with Brazilian legislation. The availability of these instruments in a bilingual format makes the adaptation process more practical, meeting both local requirements and the needs of international cooperation.
The translation of the LGPD and the RTID into English, and their widespread dissemination, are fundamental strategies to align Brazilian legislation with global standards for the protection of personal data, facilitating international cooperation, regulatory harmonization and legal certainty. The initiative allows Brazilian standards to be more accessible to foreign authorities, international organizations and multinational companies, promoting greater integration in data protection policies and encouraging international trade. In addition, it contributes to the recognition of Brazil as an active partner in the protection of personal data, strengthening the visibility and performance of the ANPD on the international scene.
This initiative represents an important milestone in the ongoing internationalization project of the ANPD, as it expands the reach of its publications and ensures that information on data protection in Brazil is accessible to an international audience. In addition, these translations reinforce Brazil's integration into the global data protection ecosystem, promoting interoperability of concepts, systems and mechanisms for international data transfer.
- ANPD publishes call for partnership in Artificial Intelligence and Data Protection Sandbox project
The National Data Protection Authority (ANPD) announces the opening of a call for proposals for partnerships with universities and public institutions interested in offering consultancy services for the ANPD's Experimental Regulatory Environment in Artificial Intelligence and Data Protection (“Regulatory Sandbox”).
The call for partnership is part of the cooperation established between ANPD and the United Nations Development Program (UNDP), with a duration of 20 months, as established in the notice.
The partnership aims to implement and execute the regulatory sandbox: the contracted institution will evaluate projects, implement the experimental regulatory environment, and train and monitor the participants of the regulatory sandbox.
Public higher education institutions that carry out research in the area of Experimental Regulatory Environment in Artificial Intelligence and Data Protection and that meet the other criteria of the notice may apply.
Proposals for the partnership will be received until 01/24/2025.
- ANPD inspects 20 companies over lack of a Data Protection Officer and adequate communication channel
The National Data Protection Authority (ANPD) has initiated an inspection process involving 20 large companies that did not provide the contact details of their Data Protection Officer, as required by Article 41 of the General Data Protection Law (LGPD). The measure also extends to organizations that, in addition to not providing an adequate communication channel to serve data subjects, offer channels that are not effective, making it difficult to exercise rights such as access, correction and deletion of personal data.
The initiative is part of the Monitoring Cycle and is aligned with the 2024-2025 Map of Priority Themes, which highlights the guarantee of data subjects' rights as one of the central axes of the Authority's action.
- ANPD launches guide on the role of the Person in Charge
The National Data Protection Authority (ANPD) published, this Thursday (19/12), a guidance document entitled “Performance of the Person Responsible for the Processing of Personal Data”.
The document complements Resolution CD/ANPD No. 18, of July 16, 2024, which establishes the Regulation on the activities of the person responsible for processing personal data.
The purpose of the guide is to provide guidance on the work of this professional, facilitating the interpretation of the standard and contributing to the proper execution of the activities provided for in the General Data Protection Law (LGPD).
Furthermore, the guide aims to indicate good practices for personal data processing agents. At the end of the document, in the appendices, suggestions for formal act models for the appointment of the person in charge are presented.
The person in charge acts as a communication channel between the data subject, the data processing agent and the ANPD. Among other activities, the person in charge guides the employees and contractors of the data processing agent on the practices to be adopted for the protection of personal data.
If you have any questions, please send us an email to datasecurity@catts.eu
How can we help?
CATTS is your dedicated partner for comprehensive data protection and compliance solutions. From strategic guidance and customized training to data security assessments and regulatory monitoring, we empower businesses for ethical success in the digital age. Whether it's GDPR compliance, Privacy Impact Assessments, or incident response, CATTS ensures tailored strategies to your unique data protection needs.
Contact Us