European Union

The EU's comprehensive framework for data protection and digital rights

Overview

This page covers the EU-wide legal framework. EU member state pages cover national implementation, derogations, and country-specific laws, and reference back here for the foundational framework.

The European Union has established a comprehensive framework for data protection and digital rights. Unlike any other jurisdiction, the EU treats data protection as a standalone fundamental right, not merely a subset of privacy, but an independent constitutional guarantee with its own enforcement architecture.

This framework rests on two constitutional pillars. The Charter of Fundamental Rights of the European Union enshrines both the right to respect for private and family life (Article 7) and the right to protection of personal data (Article 8).[1] Article 8 is unique in international law: it establishes that personal data “must be processed fairly for specified purposes and on the basis of the consent of the person concerned or some other legitimate basis laid down by law,” that everyone has the right of access and rectification, and that compliance must be subject to control by an independent authority.[2] This final provision (Article 8(3)) is the constitutional basis for the Data Protection Authorities (DPAs) in every member state.

The Treaty on the Functioning of the European Union (TFEU) reinforces this foundation. Article 16(1) declares that “everyone has the right to the protection of personal data concerning them,” while Article 16(2) provides the legal basis for EU data protection legislation, including the GDPR, and requires that compliance be monitored by independent authorities.[3] The EU is unique among international organizations in enshrining data protection as a constitutional obligation in its founding treaty.

From this constitutional foundation, the EU has built a layered regulatory architecture: the GDPR for general data protection, the ePrivacy Directive for electronic communications, the Law Enforcement Directive for police and criminal justice processing, the Digital Services Act and Digital Markets Act for platform regulation, the AI Act for artificial intelligence, and the Data Governance Act and Data Act for data sharing and access. Together with decades of landmark case law from the Court of Justice of the European Union (CJEU), this framework has become one of the most influential privacy regimes in international law.

General Data Protection Regulation (GDPR)

The General Data Protection Regulation (Regulation (EU) 2016/679) is the cornerstone of EU data protection law. Adopted on April 27, 2016 and applicable from May 25, 2018, the GDPR replaced the 1995 Data Protection Directive and established a directly applicable, uniform legal framework across all EU member states.[4]

Scope and Extraterritorial Reach

The GDPR applies to the processing of personal data wholly or partly by automated means, as well as non-automated processing that forms part of a filing system. It does not apply to purely personal or household activities, common foreign and security policy, or law enforcement processing (which falls under the LED).[4]

One of the GDPR’s most significant features is its extraterritorial reach under Article 3. The Regulation applies to controllers and processors established in the EU regardless of where processing occurs (the establishment principle). Crucially, it also applies to non-EU entities whose processing relates to offering goods or services to data subjects in the EU, irrespective of whether payment is required, or to monitoring the behavior of data subjects within the EU (the targeting principle).[5] As a result, US tech firms, Chinese social media platforms, and other non-EU providers must comply when they target or monitor people in the EU.

The Seven Principles (Article 5)

All processing of personal data must comply with seven foundational principles:[6]

1. Lawfulness, Fairness, and Transparency – Data must be processed lawfully, fairly, and in a transparent manner in relation to the data subject.

2. Purpose Limitation – Data must be collected for specified, explicit, and legitimate purposes and not further processed in a manner incompatible with those purposes.

3. Data Minimization – Data must be adequate, relevant, and limited to what is necessary in relation to the purposes for which it is processed.

4. Accuracy – Data must be accurate and, where necessary, kept up to date; every reasonable step must be taken to ensure inaccurate data is erased or rectified without delay.

5. Storage Limitation – Data must be kept in a form permitting identification of data subjects for no longer than is necessary for the purposes of processing.

6. Integrity and Confidentiality – Data must be processed in a manner that ensures appropriate security, including protection against unauthorized or unlawful processing and against accidental loss, destruction, or damage.

7. Accountability – The controller shall be responsible for, and be able to demonstrate compliance with, all of the above principles.

Six Lawful Bases for Processing (Article 6)

Processing is lawful only if at least one of the following applies:[7]

Consent (Article 6(1)(a)) – The data subject has given consent to processing for one or more specific purposes. Consent must be freely given, specific, informed, and unambiguous. For children under 16 (or as low as 13 depending on member state law), parental consent is required for information society services.

Contract (Article 6(1)(b)) – Processing is necessary for performance of a contract with the data subject, or to take steps at the data subject’s request prior to entering a contract.

Legal Obligation (Article 6(1)(c)) – Processing is necessary for compliance with a legal obligation to which the controller is subject.

Vital Interests (Article 6(1)(d)) – Processing is necessary to protect the vital interests of the data subject or another natural person.

Public Task (Article 6(1)(e)) – Processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority.

Legitimate Interests (Article 6(1)(f)) – Processing is necessary for the legitimate interests of the controller or a third party, except where overridden by the interests or fundamental rights of the data subject. This basis is not available to public authorities processing in the performance of their tasks.

Data Subject Rights

The GDPR grants individuals a comprehensive set of rights over their personal data:[8]

Right to be Informed (Articles 13–14) – Right to receive clear information about how personal data is collected and used.

Right of Access (Article 15) – Right to obtain confirmation of whether personal data is being processed, access to that data, and information about the processing.

Right to Rectification (Article 16) – Right to have inaccurate personal data corrected without undue delay.

Right to Erasure (Article 17) – The “right to be forgotten”: the right to have personal data erased where it is no longer necessary, consent is withdrawn, processing is unlawful, or data was collected from a child for information society services.

Right to Restriction of Processing (Article 18) – Right to restrict processing where accuracy is contested, processing is unlawful, or pending verification of legitimate grounds.

Right to Data Portability (Article 20) – Right to receive personal data in a structured, commonly used, machine-readable format and transmit it to another controller.

Right to Object (Article 21) – Right to object to processing based on public task or legitimate interests, including profiling; absolute right to object to direct marketing.

Rights Related to Automated Decision-Making (Article 22) – Right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects or similarly significantly affects the data subject.

Special Categories of Data (Article 9)

Processing of certain sensitive categories of data is prohibited by default unless specific exceptions apply. These categories include: racial or ethnic origin, political opinions, religious or philosophical beliefs, trade union membership, genetic data, biometric data (for identification purposes), health data, and data concerning sex life or sexual orientation.[9] Exceptions include explicit consent, employment and social security obligations, vital interests, legal claims, substantial public interest, health and social care, public health, and archiving or research purposes. Personal data relating to criminal convictions and offences (Article 10) may only be processed under the control of official authority or where authorized by Union or member state law.

Controller and Processor Obligations

Data Protection Officer (Articles 37–39): A DPO must be designated where processing is carried out by a public authority; where core activities require regular and systematic monitoring of data subjects on a large scale; or where core activities involve large-scale processing of special category data or criminal conviction data. The DPO must be independent, must not receive instructions regarding their tasks, and reports directly to the highest management level.[10]

Data Protection Impact Assessment (Article 35): A DPIA is required where processing is likely to result in a high risk to rights and freedoms, particularly when using new technologies. It is mandatory for systematic and extensive profiling with significant effects, large-scale processing of special category data, and large-scale systematic monitoring of publicly accessible areas.[10]

Records of Processing Activities (Article 30): Controllers and processors must maintain written records including purposes of processing, categories of data subjects and data, recipients, international transfers, retention periods, and security measures.

Data Breach Notification (Articles 33–34)

Controllers must notify the supervisory authority without undue delay and, where feasible, within 72 hours of becoming aware of a personal data breach, unless the breach is unlikely to result in a risk to the rights and freedoms of natural persons. Data subjects must be notified without undue delay when the breach is likely to result in a high risk to their rights and freedoms, unless appropriate technical measures (such as encryption) were applied or subsequent measures have eliminated the risk.[11]
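
The 72-hour clock runs from the moment the controller becomes aware of the breach, not from the breach itself. A minimal sketch of the deadline arithmetic (the function and variable names are illustrative, not drawn from the GDPR or any library):

  from datetime import datetime, timedelta

  NOTIFY_WINDOW = timedelta(hours=72)  # Article 33(1): where feasible, within 72 hours of awareness

  def notification_deadline(became_aware_at: datetime) -> datetime:
      # The deadline runs from awareness of the breach, not from when it occurred
      return became_aware_at + NOTIFY_WINDOW

  aware = datetime(2026, 3, 3, 9, 30)
  print(notification_deadline(aware))  # 2026-03-06 09:30:00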

International Data Transfers (Chapter V)

The GDPR establishes a tiered framework for transferring personal data outside the EU:[12]

Adequacy Decisions (Article 45): The European Commission may determine that a third country ensures an adequate level of protection. Transfers can then occur without specific authorization. See the Adequacy Decisions section below for the full list of 17 current decisions.

Standard Contractual Clauses (Article 46(2)(c)): Pre-approved contractual clauses adopted by the Commission. The current SCCs were adopted on June 4, 2021 (Commission Implementing Decision 2021/914). Following the Schrems II judgment, supplementary measures may be needed in addition to SCCs.[13]

Binding Corporate Rules (Article 47): Internal rules adopted by a corporate group for transfers within the group to entities in third countries. Must be approved by the competent supervisory authority.

Derogations (Article 49): In the absence of adequacy decisions or appropriate safeguards, transfers may occur based on explicit consent, contract necessity, important public interest, legal claims, vital interests, or transfer from a public register.
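
Chapter V is applied in a fixed order: adequacy first, then appropriate safeguards, then the narrow Article 49 derogations as a last resort. A sketch of that decision sequence follows; the country set is deliberately partial and illustrative, and real transfers require case-by-case legal assessment rather than a lookup:

  # Order of analysis: Article 45 adequacy, then Article 46 safeguards,
  # then Article 49 derogations as a last resort.
  ADEQUATE = {"Japan", "South Korea", "Switzerland", "United Kingdom"}  # partial, illustrative list

  def transfer_mechanism(destination: str, has_sccs: bool, has_bcrs: bool,
                         derogation: str | None = None) -> str:
      if destination in ADEQUATE:
          return "Article 45 adequacy decision"
      if has_sccs or has_bcrs:
          # Post-Schrems II: also run a transfer impact assessment and add
          # supplementary measures where the destination's law requires it.
          return "Article 46 appropriate safeguards"
      if derogation:
          return "Article 49 derogation: " + derogation
      return "transfer not permitted"

  print(transfer_mechanism("Brazil", has_sccs=True, has_bcrs=False))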

Enforcement and the One-Stop-Shop

Each EU member state must provide for one or more independent supervisory authorities (DPAs) to monitor GDPR application. For cross-border processing, the one-stop-shop mechanism (Article 56) designates the DPA of the controller’s main establishment as the Lead Supervisory Authority. Ireland’s Data Protection Commission serves as the LSA for many major tech companies, including Meta, Google, Apple, Microsoft, TikTok, and LinkedIn, due to their European headquarters being located in Ireland.[4]

The EDPB’s consistency mechanism (Articles 63–67) ensures uniform application through opinions on draft decisions, dispute resolution, and binding decisions where supervisory authorities disagree.

Penalties (Article 83)

Tier 1 (Article 83(4)): Up to €10 million or 2% of worldwide annual turnover (whichever is higher) for violations related to controller/processor obligations, certification, and monitoring bodies.

Tier 2 (Article 83(5)): Up to €20 million or 4% of worldwide annual turnover (whichever is higher) for violations of basic principles, data subject rights, international transfer provisions, and non-compliance with supervisory authority orders.[14]
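
The “whichever is higher” language makes the cap a simple maximum of the flat amount and the turnover percentage. A minimal sketch (the turnover figure in the example is invented):

  def fine_cap_eur(worldwide_annual_turnover: float, tier: int) -> float:
      # Article 83(4): up to EUR 10 million or 2%; Article 83(5): up to EUR 20 million or 4%;
      # in both cases whichever is higher.
      flat, pct = (10_000_000, 0.02) if tier == 1 else (20_000_000, 0.04)
      return max(flat, pct * worldwide_annual_turnover)

  # A group with EUR 3 billion in worldwide annual turnover:
  print(fine_cap_eur(3_000_000_000, tier=2))  # 120000000.0, i.e. 4% exceeds the EUR 20 million floor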

Top GDPR Fines

By early 2025, approximately 2,245 GDPR fines had been recorded, totaling approximately €5.65 billion in cumulative penalties.[15] The largest enforcement actions include:

Meta / Facebook – €1.2 billion (Ireland DPC, May 2023) – Transferring personal data of EU/EEA Facebook users to the US without adequate legal basis, violating Article 46(1).[16]

Amazon – €746 million (Luxembourg CNPD, July 2021) – Processing personal data for behavioral advertising without proper consent.[16]

TikTok (ByteDance) – €530 million (Ireland DPC, May 2025) – Transferring European users’ data to servers in China without adequate safeguards.[17]

Meta / Instagram – €405 million (Ireland DPC, September 2022) – Publicly disclosing children’s email addresses and phone numbers via business accounts.[16]

Meta / Facebook & Instagram – €390 million (Ireland DPC, January 2023) – Improperly relying on “contract” as the legal basis for personalized advertising instead of consent.[16]

Google – €325 million (France CNIL, September 2025) – Sending Gmail users unsolicited advertising emails disguised as regular messages.[17]

LinkedIn – €310 million (Ireland DPC, October 2024) – Processing user data without valid consent for targeted advertising.[16]

Uber – €290 million (Netherlands DPA, August 2024) – Transferring European taxi drivers’ personal data to the US without adequate safeguards.[16]

Meta – €251 million (Ireland DPC, December 2024) – 2018 security breach affecting approximately 29 million Facebook accounts globally.[18]

WhatsApp – €225 million (Ireland DPC, September 2021) – Failure to meet transparency requirements under Articles 12–14.[16]

Other notable fines include Clearview AI – €30.5 million from the Netherlands DPA (September 2024), €20 million from Italy’s Garante (March 2022), and €20 million from Greece’s HDPA (July 2022) for building an illegal facial recognition database by scraping billions of images from the internet.[16]

ePrivacy Directive (2002/58/EC)

The ePrivacy Directive, also known as the “Cookie Directive,” complements the GDPR by establishing specific privacy rules for the electronic communications sector. Adopted on July 12, 2002 and significantly amended by Directive 2009/136/EC on November 25, 2009, it governs the confidentiality of communications, cookies and tracking technologies, traffic and location data, and unsolicited marketing.[19]

Key Provisions

Confidentiality of Communications (Article 5(1)): Member states must ensure the confidentiality of communications and related traffic data. Listening, tapping, storage, interception, or surveillance is prohibited without user consent unless legally authorized.

Cookies and Tracking (Article 5(3)): The storing of information or gaining access to information already stored on a user’s terminal equipment is only allowed with consent or if strictly necessary for providing a service explicitly requested by the user. This covers cookies, device fingerprinting, tracking pixels, and similar technologies.[20]

The 2009 amendment (Directive 2009/136/EC) was a significant change: it shifted from an opt-out to an opt-in consent requirement for cookies and tracking technologies. Before 2009, websites could set cookies unless users opted out. After 2009, prior informed consent became mandatory, with the exception of strictly necessary cookies.
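
In practice the opt-in rule reduces to a simple gate: nothing may be stored on or read from the user’s device unless it is strictly necessary for a service the user requested or the user has given prior consent. A minimal sketch (the cookie names and the consent store are assumptions for illustration):

  STRICTLY_NECESSARY = {"session_id", "csrf_token", "shopping_cart"}  # assumed examples

  def may_store_on_device(identifier: str, consented: set[str]) -> bool:
      # Article 5(3): strictly necessary storage is exempt; analytics,
      # advertising, and fingerprinting identifiers need prior consent.
      return identifier in STRICTLY_NECESSARY or identifier in consented

  print(may_store_on_device("shopping_cart", set()))        # True: exempt
  print(may_store_on_device("ad_tracker", set()))           # False: no prior consent
  print(may_store_on_device("ad_tracker", {"ad_tracker"}))  # True: consent given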

Traffic Data (Article 6): Traffic data relating to subscribers and users must be erased or made anonymous when no longer needed for the purpose of the transmission of a communication. This includes data identifying the source and destination of communications, routing, timing, duration, and size.

Location Data (Article 9): Location data other than traffic data may only be processed when made anonymous or with the consent of the subscriber/user, and only for the duration necessary for providing a value-added service. Users must be informed beforehand and given the option to withdraw consent at any time.

Unsolicited Communications (Article 13): The Directive establishes an opt-in regime for electronic marketing communications. Direct marketing by email, SMS, or automated calling systems is prohibited without prior consent, with a limited “soft opt-in” exception for existing customer relationships where marketing relates to similar products or services.

Calling Line Identification (Articles 7–8): Users must have the ability to eliminate the presentation of calling line identification on a per-call and per-line basis, free of charge.

The Failed ePrivacy Regulation

On January 10, 2017, the European Commission proposed a Regulation on Privacy and Electronic Communications to replace the Directive and align it with the GDPR. The European Parliament adopted its position in October 2017, but the Council of the EU did not reach its position until February 2021, nearly four years later. Trilogue negotiations then stalled.[21]

On February 11, 2025, the European Commission disclosed in its 2025 Work Programme that it would withdraw the ePrivacy Regulation proposal, citing “no foreseeable agreement” and describing the proposal as “outdated in view of some recent legislation in both the technological and the legislative landscape.” The formal withdrawal was approved during the Commission’s 2533rd meeting on July 16, 2025.[22]

The failure was driven by fundamental disagreements between member states, tension between privacy advocates and the advertising and telecom industries, debates over metadata processing and cookie walls, and the Commission’s shift in focus toward AI competitiveness under the 2025 EU Digital Omnibus package.[23]

The ePrivacy Directive (2002/58/EC, as amended in 2009) remains in force and continues to be implemented through national law in each member state. An interim derogation was extended until 2028.[24]

Law Enforcement Directive (2016/680)

The Law Enforcement Directive (LED), also called the “Police Directive,” was adopted alongside the GDPR on April 27, 2016, with a transposition deadline of May 6, 2018. It establishes specific data protection rules for the processing of personal data by competent authorities for law enforcement purposes, including police forces, prosecutors, and courts exercising criminal jurisdiction.[25]

Key Differences from GDPR

While the LED and GDPR share the same constitutional basis (Article 16(2) TFEU), they differ in several important respects. The LED is a Directive requiring national transposition, not a directly applicable Regulation. It relies on a single lawful basis (processing must be necessary for a task carried out by a competent authority for law enforcement) rather than the GDPR’s six lawful bases. Consent is generally not relied upon due to the power imbalance in law enforcement contexts.[26]

Data subject rights may be exercised indirectly through the supervisory authority where direct access could prejudice ongoing investigations. The LED imposes unique requirements including the obligation to distinguish between categories of data subjects (suspects, convicted persons, victims, witnesses) and between facts and personal assessments. Logging requirements for automated processing are more granular than the GDPR’s records of processing. Penalties are left to member states to determine, rather than being specified in the instrument itself.[25]

Key LED-Specific Provisions

Categories of Data Subjects (Article 6): The LED requires a clear distinction between categories of data subjects: (a) persons suspected of having committed or being about to commit a criminal offence; (b) convicted persons; (c) victims or potential victims; (d) other parties such as witnesses, contacts, associates, and informants. This categorization requirement has no equivalent in the GDPR.[25]

Facts vs. Assessments (Article 7(1)): Personal data based on facts must, as far as possible, be distinguished from personal data based on personal assessments, an important safeguard against the propagation of unverified intelligence through law enforcement databases.

Time Limits (Article 5): Appropriate time limits must be established for the erasure of personal data or for periodic review of the need for continued storage.

Supervision remains with the same national DPAs that enforce the GDPR, but exercising competence under the LED framework. Many member states had difficulty meeting the May 2018 transposition deadline, and implementation varies across the EU.[27]

Data Governance Act (2022/868)

The Data Governance Act (DGA), adopted on May 30, 2022 and applicable from September 24, 2023, is a key pillar of the EU’s European Strategy for Data. It aims to increase trust in data sharing, create rules on the neutrality of data intermediaries, and facilitate the reuse of certain public sector data.[28]

The DGA introduces three main mechanisms. Public Sector Data Reuse (Chapter II) establishes conditions for reusing protected public sector data while respecting existing protections through anonymization, pseudonymization, and secure processing environments. Data Intermediary Services (Chapter III) creates a notification and registration framework for entities that facilitate data sharing; intermediaries must remain neutral and cannot use shared data for their own purposes. Data Altruism (Chapter IV) establishes a voluntary EU framework for organizations that collect data made available voluntarily by individuals or companies for objectives of general interest, such as scientific research or public health.[29]

The DGA also establishes a European Data Innovation Board (Chapter VI), an expert group to advise the Commission on data governance practices, cross-sector standards, and data altruism.

Implementation has been uneven. On September 24, 2024, the Commission released an implementing guidance document. However, on December 16, 2024, the Commission sent reasoned opinions to ten member states, including Germany, Austria, Poland, and Portugal, for failure to designate the responsible authorities required to implement the DGA.[28]

Digital Services Act (2022/2065)

The Digital Services Act (DSA) establishes a comprehensive regulatory framework for digital intermediary services in the EU. Adopted on October 19, 2022, it became applicable to Very Large Online Platforms (VLOPs) and Very Large Online Search Engines (VLOSEs) from August 25, 2023, and to all other providers from February 17, 2024.[30]

Tiered Obligations

The DSA creates a layered framework based on the type and size of service. All intermediary services must provide transparency reporting and cooperate with authorities. Hosting services must additionally provide notice-and-action mechanisms and statements of reasons for content moderation decisions. Online platforms must further implement complaint handling, out-of-court dispute resolution, trusted flagger systems, and advertising transparency. At the top tier, VLOPs and VLOSEs (platforms with 45 million or more monthly active users in the EU) face the strictest obligations: systemic risk assessment, independent audits, risk mitigation, crisis response mechanisms, data access for researchers, and compliance officers.[30]
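
Which tier applies turns on the type of service and, at the top, the 45 million monthly-active-recipient threshold. A rough sketch of that cumulative logic (the service categories and example figure are simplified assumptions):

  VLOP_THRESHOLD = 45_000_000  # average monthly active recipients in the EU

  def dsa_tier(service_type: str, eu_monthly_users: int) -> str:
      # Obligations are cumulative: each tier also carries those below it.
      if service_type in {"online platform", "search engine"} and eu_monthly_users >= VLOP_THRESHOLD:
          return "VLOP/VLOSE: risk assessments, audits, researcher data access, crisis response"
      if service_type == "online platform":
          return "online platform: complaints, trusted flaggers, ad transparency"
      if service_type == "hosting":
          return "hosting: notice-and-action, statements of reasons"
      return "intermediary: transparency reporting, cooperation with authorities"

  print(dsa_tier("online platform", 60_000_000))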

Key Platform Obligations

The DSA’s privacy-relevant provisions include a prohibition on targeting advertising based on profiling using special categories of personal data (Article 26(3)), a prohibition on dark patterns (Article 25) that deceive or manipulate users, a requirement that VLOPs offer at least one recommender system option not based on profiling (Article 38), and enhanced protections for minors (Article 28).[30]

Designated VLOPs and VLOSEs

As of February 2026, designated VLOPs include: AliExpress, Amazon Store, Apple AppStore, Booking.com, Facebook, Google Play, Google Maps, Google Shopping, Instagram, LinkedIn, Pinterest, Pornhub, Shein, Snapchat, Temu, TikTok, X (Twitter), Wikipedia, XNXX, XVideos, YouTube, and Zalando. Designated VLOSEs are Bing and Google Search.[31]

Enforcement

The European Commission directly supervises VLOPs and VLOSEs, while national Digital Services Coordinators supervise smaller providers. Penalties reach up to 6% of worldwide annual turnover, with periodic penalties up to 5% of average daily worldwide turnover. As of November 2025, the Commission had started 14 investigations into DSA compliance, including proceedings against AliExpress, Facebook, Instagram, Temu, TikTok, and X.[32]

Protection of Minors – Article 28 and Age Verification

Article 28(1) DSA requires online platforms accessible to minors to “put in place appropriate and proportionate measures to ensure a high level of privacy, safety and security of minors on their service.”[73] This obligation extends beyond child-specific services to any online platform that minors may access.

On July 14, 2025, the European Commission published final guidelines on the protection of minors under the DSA, providing detailed guidance to platforms on meeting their Article 28 obligations.[74] The guidelines distinguish between three age assurance approaches:

1. Self-Declaration – Users enter their date of birth or click a confirmation box. The Commission considers this approach unreliable for age assurance purposes, as it can be easily circumvented.

2. Age Estimation – Platforms use tools such as facial analysis, behavioral patterns, or device fingerprinting to estimate the user’s age without requiring identity documents.

3. Age Verification – Platforms check official identity documents or trusted digital credentials, such as the upcoming EU Digital Identity Wallet (expected 2026). This provides the highest level of age assurance but raises privacy concerns if not implemented correctly.

On the same day, the Commission made available a blueprint for an age verification solution designed to enable users to prove they are over 18 when accessing restricted adult content, such as online pornography, without revealing any other personal information.[75] The blueprint is based on open-source technology and designed to be robust, user-friendly, privacy-preserving, and fully interoperable with future European Digital Identity Wallets.

The Commission’s guidelines emphasize that age verification methods must be proportionate to the risks posed by the platform. Self-declaration may be sufficient for platforms with low risks to minors, while platforms with significant risks (such as those featuring violent content, adult material, or commercial surveillance advertising) must use more robust methods. Critically, the guidelines state that age verification must not become a pretext for mass surveillance or the creation of centralized databases linking users to their online activities.

The DSA guidelines will serve as the Commission’s reference point for assessing platform compliance with Article 28 and may inform national Digital Services Coordinators in their enforcement actions. Platforms that fail to implement appropriate age assurance measures face the DSA penalty framework described above.

Digital Markets Act (2022/1925)

The Digital Markets Act (DMA) regulates large platforms that serve as “gatekeepers” between businesses and consumers. It was adopted on September 14, 2022 and became applicable on May 2, 2023, with a compliance deadline of March 7, 2024 for the first designated gatekeepers.[33]

Gatekeeper Designation

A platform is designated as a gatekeeper if it has a significant impact on the internal market (annual EEA turnover exceeding €7.5 billion or market capitalization exceeding €75 billion), provides a core platform service that is an important gateway for business users to reach end users (45+ million monthly active end users and 10,000+ yearly active business users in the EU), and enjoys an entrenched and durable position.[33]
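
The quantitative presumption combines three tests, each tied to a numeric threshold. A sketch of the Article 3(2) arithmetic (the example figures are invented, and the real designation process also allows rebuttal and qualitative designation):

  def presumed_gatekeeper(eea_turnover_eur: float, market_cap_eur: float,
                          monthly_end_users: int, yearly_business_users: int,
                          thresholds_met_three_years: bool) -> bool:
      significant_impact = eea_turnover_eur >= 7.5e9 or market_cap_eur >= 75e9
      important_gateway = (monthly_end_users >= 45_000_000
                           and yearly_business_users >= 10_000)
      entrenched = thresholds_met_three_years  # user thresholds met in each of the last three financial years
      return significant_impact and important_gateway and entrenched

  print(presumed_gatekeeper(9e9, 120e9, 50_000_000, 12_000, True))  # True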

As of 2025, seven gatekeepers have been designated across 24 core platform services:[34]

Alphabet (Google) – Google Search, YouTube, Android, Chrome, Maps, Play, Shopping, and Ads (8 services).

Amazon – Amazon Marketplace and Amazon Ads (2 services).

Apple – iOS, iPadOS, Safari, and App Store (4 services).

ByteDance – TikTok (1 service).

Meta – Facebook, Instagram, WhatsApp, Messenger, Meta Marketplace, and Meta Ads (6 services).

Microsoft – Windows and LinkedIn (2 services).

Booking Holdings – Booking.com (1 service, designated May 2024).

Key Obligations

Privacy-relevant DMA obligations include a prohibition on combining personal data from the core platform service with data from other gatekeeper services or third-party sources unless the user gives specific GDPR consent (Article 5(2)). Gatekeepers must also not sign in users to other gatekeeper services in order to combine personal data (Article 5(2)).[35]

Data Portability and Interoperability: Gatekeepers must allow end users to port their data (Article 6(9)), provide effective interoperability with their messaging services at the request of third-party providers (Article 7), notably affecting WhatsApp and Messenger, and allow business users access to data generated by their activities on the platform (Article 6(10)).

Anti-Steering and Sideloading: Gatekeepers must allow business users to promote offers and conclude contracts outside the platform (Article 5(4)) and allow installation of third-party applications and app stores on their operating systems (Article 6(4)).

Self-Preferencing Prohibition: Gatekeepers may not treat their own products more favorably in ranking than similar third-party products (Article 6(5)).

Advertising Transparency: Gatekeepers must provide advertisers and publishers with information about prices and fees (Articles 5(9)–(10)) and free access to performance measurement tools (Article 6(8)).

Penalties

Fines reach up to 10% of worldwide annual turnover (up to 20% for repeat offenses), with periodic penalties up to 5% of average daily worldwide turnover. In cases of systematic non-compliance, the Commission may impose structural remedies including divestiture.[33]

Cryptography and Export Controls

The European Union maintains a comprehensive export control regime for cryptographic products based on the Wassenaar Arrangement on Export Controls for Conventional Arms and Dual-Use Goods and Technologies. While encryption use within the EU is unrestricted, exports of cryptographic technology outside the Union are subject to licensing requirements.[83]

Legal Framework

The EU’s export control regime for cryptography has been shaped by two principal regulations:

Council Regulation (EC) No 1334/2000 (June 22, 2000) set up the first Community regime for the control of exports of dual-use items and technology, superseding previous national regimes; it has since been repealed.[83]

Regulation (EU) 2021/821 (May 20, 2021), which repealed and replaced the intermediate Regulation (EC) No 428/2009, sets up the modernized Union regime currently in force for the control of exports, brokering, technical assistance, transit, and transfer of dual-use items. Cryptographic equipment, software, and technology are listed in Category 5 Part 2 of Annex I.[84]

Wassenaar Arrangement Connection

The EU dual-use export control list is directly based on the Wassenaar Arrangement’s Dual-Use List. The Wassenaar Arrangement, established in 1996, is an international export control regime with 42 participating states (as of 2026), including every EU member state except Cyprus, as well as the United States, the United Kingdom, Switzerland, and Japan.[85]

Cryptographic items controlled under the Wassenaar Arrangement include:

  • Symmetric cryptography – Products using symmetric algorithms with key lengths exceeding 56 bits
  • Asymmetric cryptography – Products using asymmetric algorithms with key lengths exceeding 512 bits
  • Quantum cryptography – Products implementing quantum key distribution and related technologies
  • Cryptanalytic systems – Systems designed to perform cryptanalytic functions

The Wassenaar list is updated annually, and the EU implements these updates through amendments to the Annex of Regulation 2021/821.
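
A first-pass screen against the headline key-length thresholds above might look like the sketch below; it is illustrative only, since actual classification under Category 5 Part 2 depends on numerous notes, decontrols (such as the mass-market exemption), and algorithm-specific rules, and the function name is hypothetical:

  def exceeds_headline_thresholds(symmetric_key_bits: int = 0,
                                  asymmetric_modulus_bits: int = 0) -> bool:
      # Category 5 Part 2 headline thresholds: symmetric algorithms above 56 bits,
      # asymmetric (factorization / discrete-log) algorithms above 512 bits.
      return symmetric_key_bits > 56 or asymmetric_modulus_bits > 512

  print(exceeds_headline_thresholds(symmetric_key_bits=256))       # True, e.g. AES-256
  print(exceeds_headline_thresholds(asymmetric_modulus_bits=512))  # False: not above 512 bits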

Scope of Controls

All information security items subject to export controls in the EU are listed in Category 5 Part 2 of Annex I to Regulation 2021/821. Exports of these items to destinations outside the EU require an export authorization unless they qualify for an exemption.[84]

Key exemptions and special provisions include:

  • Intra-EU transfers: Transfers between EU member states are fully liberalized, with no licensing required
  • Mass-market cryptography: Certain widely available commercial encryption products may qualify for simplified procedures or license exceptions
  • Open source software: Cryptographic software that is publicly available and in the public domain may be exempt from licensing requirements, though this remains a complex legal area with varying interpretations among member states

Export Licensing

EU member states maintain national export control authorities responsible for reviewing and issuing licenses. While the EU sets common standards through Regulation 2021/821, member states retain authority over individual licensing decisions.

Export authorization types include:

  • Individual export authorizations: Granted for one end-user and one or more items
  • Global export authorizations: Cover one or multiple items to one or more end-users in one or more third countries
  • EU general export authorizations: Available for exports to a select group of trusted destinations

Trusted Destinations

The EU maintains an EU general export authorization for cryptographic exports to a select group of non-EU countries with advanced export control systems. This streamlines procedures for exports to trusted partners including the United States, Canada, Australia, New Zealand, Switzerland, Norway, and Japan, though specific items and end-uses may still require individual licenses.[86]

Enforcement

Violations of export control regulations are subject to enforcement by member state authorities. Penalties vary by member state but typically include fines, imprisonment, and seizure of goods. Companies found in violation may face export privilege denial, preventing them from engaging in future export activities.

Relationship to Encryption Policy

Critically, the EU’s export control regime applies only to exports outside the Union. It does not restrict the development, use, or distribution of encryption within the EU. This represents a fundamental policy choice: while the EU seeks to prevent proliferation of cryptographic technology to destinations where it may threaten security, it maintains strong protection for encryption use domestically.

The EU’s stance on encryption has been tested by proposals such as Chat Control (the CSAM Regulation), discussed in detail below, which would have required client-side scanning of encrypted communications. The October 2025 blocking minority that forced removal of those provisions reflects a broader European commitment to protecting encryption architectures despite law enforcement and intelligence pressures.[78]

AI Act (2024/1689)

The EU AI Act is the first broad regulatory framework for artificial intelligence adopted by a major jurisdiction. Published in the Official Journal on July 12, 2024 and entering into force on August 1, 2024, it follows a phased implementation schedule through August 2, 2027.[36]

Phased Implementation

February 2, 2025 – Prohibited AI practices must cease; AI literacy obligations begin.

August 2, 2025 – Governance provisions and obligations for general-purpose AI (GPAI) models take effect.

August 2, 2026 – Full application: high-risk AI system requirements (Annex III), conformity assessments, CE marking, EU database registration.

August 2, 2027 – Requirements for high-risk AI systems covered by other EU harmonization legislation (Annex II) take full effect.[37]

Risk-Based Classification

The AI Act classifies AI systems into four tiers. Unacceptable risk systems are outright banned. High risk systems must meet strict requirements before market placement. Limited risk systems have specific transparency obligations (e.g., chatbots must disclose they are AI, deepfakes must be labeled). Minimal risk systems are subject to no specific regulatory requirements.[36]

Prohibited AI Practices (Article 5)

As of February 2, 2025, the following are banned:[38]

1. Subliminal, manipulative, or deceptive techniques to distort behavior causing significant harm.

2. Exploitation of vulnerabilities of specific groups (age, disability, social or economic situation).

3. Social scoring by public authorities leading to detrimental treatment unrelated to the original context.

4. Predictive policing based solely on profiling or personality traits.

5. Untargeted scraping of facial images from the internet or CCTV to build facial recognition databases.

6. Emotion recognition in the workplace and educational institutions (except for medical or safety purposes).

7. Biometric categorization to deduce race, political opinions, trade union membership, religious beliefs, sex life, or sexual orientation.

8. Real-time remote biometric identification in publicly accessible spaces for law enforcement (with narrow exceptions).

Biometric Identification Restrictions

Real-time remote biometric identification in public spaces for law enforcement may only be authorized in three narrow situations: targeted search for victims of abduction, trafficking, or sexual exploitation and search for missing persons; prevention of a specific and imminent threat to life or a genuine terrorist threat; and localization of a person suspected of a criminal offence punishable by at least four years. Even in these cases, prior authorization by a judicial authority or independent administrative body, a fundamental rights impact assessment, and EU database registration are required.[39]

The AI Act intersects with the GDPR: data used for training AI must comply with GDPR requirements, high-risk AI systems processing personal data must comply with both frameworks, and national DPAs are expected to play a role in enforcement where AI systems process personal data.[40]

Data Act (2023/2854)

The Data Act, adopted on December 13, 2023 and applicable from September 12, 2025, establishes a user-centric access and sharing regime for both personal and non-personal data generated by connected products (IoT) and related services. It empowers users, both consumers and businesses, with greater control over data generated by their connected devices, from smart home appliances and wearables to cars and industrial machinery.[41]

Manufacturers must design products to make data easily, securely, and directly accessible to users (Article 3). Users have the right to access their data and share it with third parties in a commonly used, machine-readable format, free of charge (Articles 4–5). Contractual terms for data access must be fair, reasonable, and non-discriminatory, with protections against unfair terms imposed by parties with stronger bargaining positions.[42]

Business-to-Business Data Sharing (Chapter IV): Applying the fair, reasonable, and non-discriminatory terms noted above, the Act further prohibits unfair contractual terms imposed by parties with significantly stronger bargaining positions and limits what can be charged to micro, small, and medium-sized enterprises to the cost of making data available.[42]

Government Access to Data (Chapter V): Public sector bodies and EU institutions may request access to privately held data where there is an exceptional need (e.g., public emergency or official statistics mandate). Data must be requested as a last resort, and requests must be specific, proportionate, and respect trade secrets.

Cloud Switching and Interoperability (Chapter VI): Cloud and SaaS providers must eliminate technical and contractual lock-ins. Data must be portable within 30 days, and switching charges must be progressively eliminated (fully by January 12, 2027). Providers must ensure interoperability with other data processing services.[41]

Member states were required to adopt national enforcement regimes by September 12, 2025. Penalties must be effective, proportionate, and dissuasive, with specific fine amounts determined at the national level.

Proposed: Chat Control (CSAM Regulation)

The European Union’s proposed Regulation laying down rules to prevent and combat child sexual abuse, widely known as “Chat Control 2.0,” is a highly contested legislative proposal. First published by the European Commission on May 11, 2022, the regulation aims to combat online child sexual abuse material (CSAM) and grooming by requiring platforms to detect, report, and remove such content.[76] The proposal has generated significant debate over its implications for encryption, privacy, and the security of digital communications.

The Encryption Controversy

The most contentious aspect of the proposal is its approach to end-to-end encrypted communications. Under the Commission’s original proposal, platforms subject to detection orders would face three options:[77]

  • Abandon end-to-end encryption – Allowing authorities direct access to message content
  • Introduce a “backdoor” – Building in deliberate security weaknesses to enable lawful access
  • Implement client-side scanning – Analyzing message content on users’ devices before encryption

All three approaches would fundamentally undermine the security guarantees of end-to-end encryption. Client-side scanning, the approach most frequently cited as a “compromise,” requires platforms to build surveillance capabilities directly into user devices, creating what cryptographers describe as a functional backdoor.[78] Once the infrastructure exists to scan for CSAM, there is no technical limitation preventing its expansion to scan for other content, including political dissent, journalistic sources, or any other category a government might wish to monitor.

Technology companies including Apple, Meta (WhatsApp), and Signal have stated they would rather exit the European market than comply with requirements that undermine encryption.[79] Security researchers, cryptographers, and civil liberties organizations have broadly opposed client-side scanning, arguing it increases the attack surface for cyberattacks, data breaches, and foreign interference while potentially degrading the cybersecurity of billions of users worldwide.

Timeline and Political Developments

The proposal’s path through the EU legislative process has been marked by repeated delays, blocking coalitions, and political reversals:

June 2024: Voting on the legislation was temporarily withdrawn by the EU Council following pushback from member states and technology companies.[80]

July 2025: Denmark assumed the EU Council presidency and stated that “the Presidency will give the work on the Child Sexual Abuse (CSA) Regulation and Directive high priority.” Denmark pushed for a Council vote by October 2025.

October 31, 2025: Germany led a blocking minority of member states, forcing the Danish presidency to pull the mandatory client-side scanning provision from the vote. This represented a significant victory for privacy advocates and encryption proponents.[81]

November 26, 2025: The EU Council passed an amended version of the proposal with the mandatory client-side scanning requirement removed. However, the new version includes mandatory age verification requirements for communication providers. Big Tech companies (Google, Meta) remain able to continue voluntary scanning of unencrypted content.[82]

January 1, 2026: Cyprus assumes the rotating EU Council presidency. Existing voluntary scanning provisions (adopted under previous temporary measures) are set to expire in April 2026, creating political pressure to finalize the regulation before the deadline.

February 26, 2026: Second trilogue negotiation scheduled between the Council, Parliament, and Commission. Negotiations are ongoing as of early 2026.

Current Status – February 2026

As of February 2026, the Chat Control proposal remains in negotiation with the following elements:

Removed: Mandatory client-side scanning of encrypted communications, removed from the November 2025 Council version following the Germany-led blocking minority described above.

Added: Mandatory age verification for communication providers. The amended proposal shifts focus from content scanning to age assurance, requiring platforms to verify that users are above a certain age (typically 16 or 18 depending on national implementation) before allowing access to communication services.

Retained: Voluntary scanning by Big Tech platforms. Companies like Google (Gmail) and Meta (Facebook Messenger, Instagram) may continue to voluntarily scan unencrypted content for CSAM using existing automated tools. The regulation would codify and extend these voluntary practices.

Ongoing Debate: Privacy advocates warn that mandatory age verification creates its own privacy risks, potentially requiring users to submit identity documents or biometric data to access basic communication services. They argue this could create centralized databases linking real identities to online communications, a form of mass surveillance infrastructure that could be exploited by authoritarian regimes, hackers, or future governments with different priorities.

The April 2026 Deadline

As noted in the timeline above, existing EU measures allowing voluntary CSAM detection expire in April 2026. If no new regulation is adopted by that date, the legal basis for current voluntary scanning practices may lapse, creating a “cliff” scenario that could lead to compromises that neither fully protect encryption nor effectively combat CSAM.

Privacy advocates argue the April deadline is being used to manufacture urgency and bypass democratic scrutiny. They note that zero evidence has been presented that end-to-end encryption has caused a measurable increase in child abuse, and that the overwhelming majority of CSAM is found on unencrypted platforms where detection already occurs.[83]

Technical and Civil Liberties Concerns

Organizations including the Electronic Frontier Foundation (EFF), European Digital Rights (EDRi), the Internet Society, and prominent cryptographers have raised fundamental objections:

  • No “backdoor for good guys only”: Any mechanism to bypass encryption, whether for law enforcement, child safety, or other purposes, creates a vulnerability that can be exploited by malicious actors. There is no way to build a backdoor that only legitimate authorities can use.
  • Chilling effects on free expression: Knowledge that communications are being scanned, even if only for CSAM, will deter legitimate speech, particularly for journalists, activists, lawyers, and others who rely on confidential communications.
  • Mission creep: Once infrastructure for scanning exists, it will inevitably be expanded. What begins as CSAM detection could extend to copyright infringement, hate speech, misinformation, terrorism content, or political dissent.
  • False positives: Automated CSAM detection tools produce false positives. Innocent families sharing photos of their children, medical images, or artistic content have been flagged by automated systems, resulting in account suspensions and law enforcement investigations.
  • International precedent: If the EU adopts client-side scanning or mandatory age verification linked to identity, authoritarian regimes in Russia, China, Iran, and elsewhere will cite the EU’s model to justify their own surveillance requirements.

The Chat Control debate represents a fundamental conflict between two legitimate interests: protecting children from abuse and preserving the security and privacy infrastructure that protects everyone, including children, from surveillance, hacking, and authoritarian control. As of February 2026, the outcome remains uncertain.

EU-Level Surveillance Cooperation

Europol

Europol, headquartered in The Hague, supports and strengthens member state law enforcement cooperation in combating serious cross-border crime and terrorism. It operates under Regulation (EU) 2016/794, as significantly amended by Regulation (EU) 2022/991 (entered into force June 28, 2022), which considerably expanded its mandate.[43]

The 2022 amendments expanded Europol’s powers in several areas: large dataset processing, including data from individuals with no established link to criminal activity; enhanced cooperation with private parties such as technology companies and financial institutions; an expanded role in research and innovation, including processing personal data for training machine learning systems; and new crisis response provisions.[44]

These expansions generated significant criticism. The European Data Protection Supervisor (EDPS) stated that the amendments “weaken the fundamental right to data protection.” In January 2022, the EDPS had ordered Europol to delete datasets on individuals with no criminal link. The new Regulation’s Articles 74a and 74b effectively retroactively legalized this practice. In September 2022, the EDPS requested the CJEU to annul these provisions, arguing they threaten the rule of law and EDPS independence.[45]

The EU does not have a unified intelligence agency. Intelligence services remain under national sovereignty, and Europol’s mandate is limited to law enforcement cooperation; it has no independent arrest or investigation powers.[43]

EU-US Data Privacy Framework

The EU-US Data Privacy Framework (DPF) is the third attempt to create a legal mechanism for transatlantic commercial data transfers, following the invalidation of Safe Harbor (Schrems I, 2015) and Privacy Shield (Schrems II, 2020).

Its foundation is Executive Order 14086, signed by President Biden on October 7, 2022, requiring that US signals intelligence activities be “necessary” and “proportionate” and limited to twelve enumerated national security objectives. The EO also established the Data Protection Review Court (DPRC), an independent body within the US Department of Justice with at least six judges and two special advocates, empowered to review complaints from EU individuals and issue binding decisions on the intelligence community.[46]

On July 10, 2023, the European Commission adopted an adequacy decision for the US under the DPF, applying only to organizations that self-certify through the US Department of Commerce. The framework requires annual re-certification and provides multiple redress mechanisms including the DPRC for signals intelligence complaints.[47]

Schrems I – Invalidating Safe Harbor (2015)

In Case C-362/14, Austrian privacy activist Max Schrems challenged Facebook Ireland’s transfers of his data to the US, arguing that NSA mass surveillance programs (revealed by Edward Snowden) made adequate protection impossible. The CJEU found that US authorities could access transferred data on a generalized basis without effective judicial redress for EU individuals, and declared the Safe Harbor adequacy decision invalid, disrupting data transfers for approximately 4,500 companies.[48]

Schrems II – Invalidating Privacy Shield (2020)

In Case C-311/18, Schrems challenged Facebook’s use of Standard Contractual Clauses. The CJEU found that Privacy Shield did not include sufficient limitations on US government access, particularly through FISA Section 702 and Executive Order 12333, and that the Ombudsperson mechanism lacked independence and binding authority. Privacy Shield was declared invalid. SCCs remained valid in principle, but controllers must now conduct case-by-case transfer impact assessments and adopt supplementary measures where necessary.[49]

Legal Challenges to the DPF

In September 2025, the EU General Court dismissed a challenge by French MP Philippe Latombe, confirming the DPF’s validity. Latombe filed an appeal to the CJEU on October 31, 2025. Max Schrems and noyb (European Center for Digital Rights) have indicated they are considering a separate, broader challenge (a potential “Schrems III”) questioning whether the DPRC, created by executive order rather than statute, meets the EU Charter’s requirement for a tribunal “established by law.”[50]

Structural concerns have also emerged: the Privacy and Civil Liberties Oversight Board (PCLOB), a key US oversight body, was effectively gutted in early 2025 when a majority of its members were removed, leaving it without a quorum. Changes at the Federal Trade Commission, which has enforcement authority over DPF-certified organizations, have prompted questions about whether the adequacy decision might need to be suspended or revoked.[51]

noyb – European Center for Digital Rights

noyb (“none of your business”), mentioned above in connection with the DPF challenge, is a non-profit privacy enforcement organization founded by Max Schrems in 2017, headquartered in Vienna, Austria. It uses strategic litigation and regulatory complaints to enforce privacy rights under the GDPR and related EU legislation. Key activities include:[68]

101 Transfer Complaints: Following the Schrems II ruling described above, noyb filed 101 complaints across 30 EU/EEA countries against companies transferring data to the US via Google Analytics and Facebook Connect.

Chinese Tech Complaints: noyb filed GDPR complaints against TikTok, AliExpress, SHEIN, Temu, WeChat, and Xiaomi for transferring European users’ data to China without adequate protections, applying the same legal logic as the transatlantic transfer challenges.[69]

“Pay or OK” Campaigns: noyb has been challenging consent models where platforms offer users a choice between paying for an ad-free experience or consenting to tracking. In June 2025, noyb sued two German DPAs over their prolonged failure to act on complaints about these consent systems.[70]

noyb’s litigation has contributed to significant enforcement outcomes, including the Netflix €4.75 million fine by the Dutch DPA (December 2024) and multiple other major enforcement actions across Europe.[71]

Europol Intelligence Cooperation

Europol facilitates cooperation between member state law enforcement services through liaison officers from all EU member states stationed at Europol headquarters, cooperation agreements with non-EU states and international organizations (including Interpol), Joint Investigation Teams (JITs), the Counter Terrorism Centre (ECTC), and the European Cybercrime Centre (EC3). Europol’s information systems include the Europol Information System (EIS), operational analysis work files, and SIENA (Secure Information Exchange Network Application) for secure information exchange.[43]

Europol is supervised by the European Data Protection Supervisor (EDPS) rather than national DPAs for its own data processing activities. As noted above, the EDPS has been critical of Europol’s expanded mandate and has taken legal action challenging the 2022 amendments.[45]

Data Retention

The EU’s approach to telecommunications data retention has been shaped by a series of landmark CJEU judgments that have progressively restricted member states’ ability to require mass retention of communications metadata.

The Data Retention Directive (2006/24/EC)

Enacted in response to the 2004 Madrid and 2005 London bombings, the Data Retention Directive required telecoms and ISPs to retain metadata, including source and destination of communications, date, time, duration, type of communication, equipment identifiers, and cell tower location data, for between 6 and 24 months. It did not require retention of content.[52]

Digital Rights Ireland (2014)

In Joined Cases C-293/12 and C-594/12 (April 8, 2014), the CJEU declared the Data Retention Directive invalid in its entirety, the first time it had struck down an entire EU legislative act on fundamental rights grounds. The Court found the Directive constituted a “particularly serious” interference with Articles 7 and 8 of the Charter by requiring general and indiscriminate retention of data on practically the entire European population, without differentiation, limitation, or exception. It lacked clear rules governing the extent of interference, had insufficient safeguards for access, and imposed retention periods not limited to what was strictly necessary.[53]

Tele2 Sverige / Watson (2016)

In Joined Cases C-203/15 and C-698/15 (December 21, 2016), the CJEU confirmed the Digital Rights Ireland principle, holding that national legislation providing for general and indiscriminate retention of traffic and location data is incompatible with the ePrivacy Directive read in light of the Charter. Targeted retention may be permitted for fighting serious crime, provided it is limited in categories of data, means of communication, persons concerned, and retention period, all of which must be strictly necessary. Access must be subject to prior review by a court or independent body.[54]

La Quadrature du Net (2020)

In Joined Cases C-511/18, C-512/18, and C-520/18 (October 6, 2020), the CJEU addressed whether national security could justify general retention. It reiterated that general and indiscriminate retention is precluded even for national security purposes, but established limited exceptions: where a member state faces a serious, genuine, and present or foreseeable threat to national security, it may order general retention for a limited, renewable period subject to effective judicial review; general retention of IP addresses may be permitted for fighting serious crime; and a “quick freeze” targeted preservation order may be imposed for specific data with a sufficient link to serious crime.[55]

SpaceNet (2022)

In Joined Cases C-793/19 and C-794/19 (September 20, 2022), the CJEU struck down Germany’s data retention law, which required retention of traffic data for ten weeks and location data for four weeks. The Court confirmed that even a relatively short retention period does not make general retention proportionate, reiterating that the IP address retention and quick freeze alternatives identified in La Quadrature du Net remain the permissible approaches.[56]

Current Status

No EU-wide mandatory data retention exists. The Data Retention Directive was invalidated in 2014 and has not been replaced. However, many member states maintain their own national data retention laws, which vary widely in scope, retention periods, and safeguards. Some have been amended following CJEU rulings; others remain legally uncertain and face ongoing legal challenges.[57]

The European Commission indicated it would conduct an impact assessment in Q1 2026 on the possibility of new EU-level data retention legislation. If the results support it, a legislative proposal could be presented in the second half of 2026. Most member states support future EU legislation in this area, though they disagree on scope: some emphasize the need to remain within CJEU case law limits, while others push for broader retention powers. The “quick freeze” mechanism endorsed by the CJEU in the case law described above has gained traction as a practical alternative to bulk retention, with the Council of the EU discussing implementation frameworks.[57]

European Data Protection Board (EDPB)

The EDPB is an independent EU body established under Article 68 of the GDPR, replacing the former Article 29 Working Party on May 25, 2018. It is composed of the heads of the 27 EU member state DPAs, the European Data Protection Supervisor, and (without voting rights) the European Commission and EEA/EFTA supervisory authorities. It is currently chaired by Anu Talus (Finland), elected May 2023.[58]

Key Functions

The EDPB issues guidelines, recommendations, and best practices on GDPR interpretation; ensures consistent application through the consistency mechanism described in the GDPR Enforcement section above; adopts binding decisions (Article 65) to resolve disputes between DPAs in cross-border cases; issues opinions on matters of general application; and provides opinions on Commission draft adequacy decisions.[58]

Key Guidelines

The EDPB and its predecessor have issued guidelines that significantly shape GDPR interpretation, covering: consent (Guidelines 05/2020), legitimate interests (Guidelines 1/2024), transparency (WP260), data protection officers (WP243), DPIAs (WP248), international transfers, right of access (Guidelines 01/2022), data breach notification (Guidelines 01/2021), targeting of social media users (Guidelines 08/2020), tracking technologies (October 2024), and processors and sub-processors (October 2024).[59]

Notable Binding Decisions

The EDPB has used its Article 65 binding decision power in several significant cross-border cases, notably directing the Irish DPC to increase fines and broaden the scope of violations found in cases against Meta and WhatsApp. These include the WhatsApp transparency, Meta “contract” basis, and Instagram children’s data cases listed in the Top GDPR Fines section above.[60]

The EDPB issued no new Article 65 binding decisions in 2024, but saw a sharp increase in requests for opinions under Article 64(2), reflecting growing reliance on its guidance function.

The EDPB’s Strategy for 2024–2027 focuses on five priorities: strengthening enforcement of existing rules, supporting trustworthy innovation and new technologies (including AI), enhancing cooperation between DPAs, promoting effective data protection in the digital age, and strengthening the position of individuals. The EDPB Work Programme 2024–2025 implements this strategy.[61]

Adequacy Decisions

Under the Article 45 adequacy mechanism described in the GDPR International Data Transfers section above, 17 adequacy decisions are in force as of February 2026:[62]

Switzerland – July 26, 2000 (Decision 2000/518/EC; reaffirmed January 2024).

Canada – December 20, 2001 (Decision 2002/2/EC; commercial organizations subject to PIPEDA only; reaffirmed January 2024).

Argentina – June 30, 2003 (Decision 2003/490/EC; reaffirmed January 2024).

Guernsey – November 21, 2003 (Decision 2003/821/EC; reaffirmed January 2024).

Isle of Man – April 28, 2004 (Decision 2004/411/EC; reaffirmed January 2024).

Jersey – May 8, 2008 (Decision 2008/393/EC; reaffirmed January 2024).

Faroe Islands – March 5, 2010 (Decision 2010/146/EU; reaffirmed January 2024).

Andorra – October 19, 2010 (Decision 2010/625/EU; reaffirmed January 2024).

Israel – January 31, 2011 (Decision 2011/61/EU; reaffirmed January 2024; controversial due to surveillance concerns).[63]

Uruguay – August 21, 2012 (Decision 2012/484/EU; reaffirmed January 2024).

New Zealand – December 19, 2012 (Decision 2013/65/EU; reaffirmed January 2024).

Japan – January 23, 2019 (Decision 2019/419; first adequacy decision under the GDPR; mutual adequacy with Japan).

United Kingdom – June 28, 2021 (Decision 2021/1772; originally set to expire June 27, 2025; EDPB opinions adopted in 2025 on proposed extension to December 2031).[64]

Republic of Korea (South Korea) – December 17, 2021 (Decision 2022/254; covers commercial sector under PIPA).

United States – July 10, 2023 (Decision 2023/1795; via the EU-US Data Privacy Framework; applies only to DPF-certified organizations).

European Patent Organisation – July 15, 2025 (first adequacy decision granted to an international organization).[65]

Brazil – January 27, 2026 (most recent; mutual adequacy agreement finalized).[66]

On January 15, 2024, the European Commission published its first review of the eleven pre-GDPR adequacy decisions. All eleven were reaffirmed as providing adequate protection.[67]

Invalidated Adequacy Decisions

US – Safe Harbor (Decision 2000/520/EC) – Invalidated October 6, 2015 by the CJEU in Schrems I, as described in the EU-US Data Privacy Framework section above.[48]

US – Privacy Shield (Decision 2016/1250) – Invalidated July 16, 2020 by the CJEU in Schrems II, also described above.[49]

No other adequacy decisions have been revoked or invalidated to date. Notably, both invalidations resulted from CJEU judgments, not from Commission action; the Commission has never voluntarily revoked an adequacy decision.

Decisions Under Scrutiny

The United Kingdom adequacy decision faces concerns around UK divergence from GDPR standards under the Data (Use and Access) Act 2025 and the UK’s bulk surveillance powers under the Investigatory Powers Act 2016. The Israel decision has drawn calls for reassessment from European Parliament members and civil society groups due to mass surveillance practices and the 2023 expansion of Shin Bet surveillance powers.[63]

EU Privacy Framework Timeline

2000 – EU-US Safe Harbor adopted; Swiss adequacy decision.

2002 – ePrivacy Directive (2002/58/EC) adopted.

2006 – Data Retention Directive (2006/24/EC) adopted.

2009 – ePrivacy Directive amended (opt-in consent for cookies).

2014 – CJEU invalidates Data Retention Directive (Digital Rights Ireland).

2015 – CJEU invalidates Safe Harbor (Schrems I).

2016 – GDPR adopted (April); LED adopted (April); EU-US Privacy Shield adopted (July); Tele2/Watson judgment (December).

2017 – ePrivacy Regulation proposed (January); noyb founded.

2018 – LED transposition deadline (May 6); GDPR becomes applicable (May 25).

2019 – Japan adequacy decision (January); first major GDPR fine – Google €50 million (January).

2020 – CJEU invalidates Privacy Shield (Schrems II, July); La Quadrature du Net data retention judgment (October).

2021 – UK adequacy decision (June); Amazon €746 million fine (July); WhatsApp €225 million fine (September); South Korea adequacy decision (December).

2022 – DGA adopted (May); Europol powers expanded (June); DMA adopted (September); SpaceNet judgment (September); DSA adopted (October); EO 14086 and DPRC established (October).

2023 – first DSA VLOP designations (April); Meta €1.2 billion record fine (May); EU-US Data Privacy Framework adopted (July); DMA gatekeeper designations (September); DGA becomes applicable (September); Data Act adopted (December).

2024 – DSA fully applicable (February); AI Act enters into force (August); Uber €290 million fine (August); LinkedIn €310 million fine (October); Meta €251 million breach fine (December).

2025 – ePrivacy Regulation withdrawn (February); AI Act prohibited practices apply (February); TikTok €530 million fine (May); EPO adequacy decision (July); GPAI obligations apply (August); Data Act applicable (September).

2026 – Brazil adequacy decision (January); Commission data retention impact assessment (Q1); AI Act high-risk requirements apply (August).

2027 – AI Act fully applicable (August).

Sources

[72] EUR-Lex: Digital Services Act – Article 28(1) – Platforms accessible to minors must ensure high level of privacy, safety, and security
[73] European Commission: Guidelines on the Protection of Minors (July 14, 2025) – Final DSA Article 28 implementation guidance
[74] European Commission: Age Verification Blueprint (July 14, 2025) – Privacy-preserving solution for adult content access
[75] European Commission: Proposal for CSAM Regulation (May 11, 2022) – Chat Control 2.0 original proposal
[76] MEF: The End of End-to-end Encryption in Messaging? – Three options for providers: abandon encryption, backdoor, or client-side scanning
[78] EFF: After Years of Controversy, EU Chat Control Nears Final Hurdle (December 2025) – Companies would exit EU market rather than break encryption
[79] Wikipedia: Chat Control – June 2024 voting withdrawn following pushback
[80] The Register: Germany Slams Brakes on EU’s Chat Control (October 2025) – Germany leads blocking minority against mandatory encryption-breaking provisions
[81] EU.ci: Chat Control Regulation Consequences (November 2025) – Amended version passed November 26, 2025 with age verification
[82] Patrick Breyer: Chat Control – The EU’s CSAM Scanner Proposal – Zero evidence encryption caused increase in child abuse
[83] EUR-Lex: Council Regulation (EC) No 1334/2000 – Community regime for the control of exports of dual-use items and technology
[84] EUR-Lex: Regulation (EU) 2021/821 – Union regime for control of exports of dual-use items, Category 5 Part 2 (information security)
[85] Wassenaar Arrangement Official Website – Dual-Use List, 42 participating states, cryptographic items control categories
[86] Dechert: UK/EU Export Controls on Encryption Products – EU general export authorizations, trusted destinations, intra-EU liberalization