France
Nine Eyes member and home of one of Europe’s earliest data protection laws
EU Member State: France is a member of the European Union and is therefore subject to the General Data Protection Regulation (GDPR), the Law Enforcement Directive, the ePrivacy Directive, and other EU-level data protection instruments. For a detailed treatment of the EU framework, see the EU Framework page. This page covers France’s national implementing legislation, domestic enforcement record, intelligence and surveillance laws, and its role in the Nine Eyes intelligence alliance.
Overview
France was among the first European countries to enact comprehensive data protection legislation: the Loi Informatique et Libertés was signed into law on January 6, 1978, six years before the United Kingdom’s Data Protection Act 1984 and nearly four decades before the GDPR.[1] The independent authority created by that law, the Commission nationale de l’informatique et des libertés (CNIL), became the model on which many other European data protection authorities were subsequently built.
France is also a member of the Nine Eyes intelligence alliance, maintaining a formal bilateral signals intelligence agreement with the United States National Security Agency codenamed Lustre.[2] Following the January 2015 Charlie Hebdo attack and the November 2015 Paris attacks, France enacted surveillance legislation granting its intelligence agencies some of the broadest powers in Western Europe, including algorithmic “black box” scanning of communications metadata, a capability France was the first EU country to authorize.[3]
The result is a country with active data protection enforcement on the civilian side (the CNIL imposed EUR 486.8 million in fines in 2025 alone[4]) alongside an intelligence apparatus that operates under significantly reduced oversight constraints, particularly for international surveillance. These two dimensions coexist within the same legal framework.
Data Protection Authority: CNIL
The Commission nationale de l’informatique et des libertés (CNIL) was established by the Loi Informatique et Libertés, making it one of the oldest data protection authorities in the world.[1] The CNIL is an independent administrative authority: ministers, public authorities, and company directors cannot oppose its actions, and its President recruits staff independently.[5]
Structure and Resources
The CNIL is governed by a College of 18 members drawn from the judiciary, parliament, and qualified individuals. Operationally, it has grown substantially:
- 2024 budget: EUR 28.6 million
- 2025 budget: EUR 30.6 million (a 6.8% increase year-over-year)
- Five-year growth: Budget has increased 52% from EUR 18.8 million in 2019
- 2025 staff: 301 FTE positions (up from 189 in 2015), 8 more than in 2024[6]
Enforcement Record
The CNIL’s enforcement activity has increased substantially in recent years. In 2024, the authority processed 17,772 complaints and issued 331 corrective measures, including 87 penalties totaling more than EUR 55 million in fines.[7] In 2025, enforcement increased sharply to EUR 486.8 million in fines from 83 sanctions, nearly a nine-fold increase over the prior year.[4]
The CNIL also issued the first major GDPR fine by any EU data protection authority, fining Google EUR 50 million in January 2019.[8]
Major Fines
| Date | Entity | Amount | Violation |
|---|---|---|---|
| January 2019 | Google LLC | EUR 50 million | Lack of transparency, inadequate consent for ad personalization (first major GDPR fine by any EU DPA)[8] |
| December 2020 | Google LLC / Google Ireland | EUR 100 million | Cookie consent violations[9] |
| December 2021 | Google LLC / Google Ireland | EUR 150 million | Failing to let users reject cookies as easily as accepting them[9] |
| December 2021 | Facebook Ireland | EUR 60 million | Cookie consent violations[9] |
| January 2023 | TikTok | EUR 5 million | Manipulative cookie-consent flow making refusal harder than acceptance[10] |
| June 2023 | Criteo | EUR 40 million | Failure to verify consent collection by advertising partners[11] |
| January 2024 | Amazon France Logistique | EUR 32 million | Employee surveillance violations |
| September 2025 | Google LLC / Google Ireland | EUR 325 million | Displaying ads in Gmail without consent, invalid consent at account creation[12] |
| September 2025 | Shein (Infinite Styles Services) | EUR 150 million | Pre-loading cookies before user consent, cookies persisting after opt-out |
Google alone has been fined a combined EUR 625 million by the CNIL across four separate enforcement actions. The CNIL’s sustained focus on cookie consent and advertising transparency has made adtech compliance a recurring enforcement theme.
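To make concrete what these cookie-consent findings turn on, the following minimal server-side sketch (Flask is assumed, and the cookie names are invented for illustration) sets non-essential advertising cookies only after an explicit consent signal is present. Pre-loading such cookies before any choice is made, or continuing to place them after a refusal, is the pattern the CNIL has repeatedly sanctioned.

```python
# Minimal illustrative sketch: Flask is assumed, cookie names are invented.
# Non-essential (advertising) cookies are set only once an explicit consent
# signal is present; absence of consent or a refusal leaves only strictly
# necessary cookies in place.
from flask import Flask, request, make_response

app = Flask(__name__)

@app.route("/")
def index():
    resp = make_response("<html><body>page content</body></html>")

    # Strictly necessary cookies (e.g., a session identifier) may be set
    # regardless of consent.
    resp.set_cookie("session_id", "abc123", httponly=True)

    # Hypothetical consent cookie written by the site's consent banner:
    # "granted", "refused", or absent.
    consent = request.cookies.get("ad_consent")

    if consent == "granted":
        # Advertising cookies only after an affirmative, informed choice.
        resp.set_cookie("ad_tracking_id", "xyz789")
    # If consent is absent or refused, no advertising cookie is placed and
    # previously planted identifiers should not be read or refreshed.
    return resp

if __name__ == "__main__":
    app.run()
```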
National Framework
Loi Informatique et Libertés (Law 78-17, as amended)
Enacted on January 6, 1978, the Loi Informatique et Libertés created the CNIL and established comprehensive rules governing the automated processing of personal data at a time when most countries had no such legislation.[1]
The law has since been substantially amended to integrate the GDPR through a multi-stage process:
- June 20, 2018: Law modifying the 1978 Act to integrate GDPR Member State derogations
- December 12, 2018: Ordinance rewriting the entire Informatique et Libertés law
- May 29, 2019: Implementing decree completing the transposition process
- June 1, 2019: New version entered into force[13]
The adapted law preserves the CNIL’s role as the national supervisory authority, maintains French-specific provisions on sensitive data processing, and implements GDPR derogations concerning journalism, academic expression, and archiving.
Digital Republic Act (Loi pour une République numérique, Law 2016-1321 of October 7, 2016)
Proposed by Secretary of State for Digital Affairs Axelle Lemaire, this law introduced several provisions relevant to data protection and digital rights:[14]
- Open data by default: Public sector data must be made openly available
- Net neutrality: Enshrined in French law
- Platform loyalty obligations: Online platforms must act in good faith toward users
- Data portability: Users can transfer their data when switching online services
- Open access for research: Authors of publicly funded scientific work (50%+ public funding) may make their work openly accessible after embargo periods: 6 months for STEM fields, 12 months for humanities and social sciences
- Free software in government: Article 16 encourages the use of free software and open formats in public administration
Surveillance and Intelligence
Intelligence Act 2015 (Loi relative au renseignement, July 24, 2015)
Passed in the wake of the January 2015 Charlie Hebdo attack, this law created the first comprehensive legal framework for French intelligence surveillance, codified in the Code of Internal Security (Code de la sécurité intérieure).[3]
Authorized surveillance purposes: National defense; foreign policy interests; economic, industrial, and scientific interests; prevention of terrorism; prevention of organized crime; and prevention of proliferation of weapons of mass destruction.
Authorized surveillance techniques: Targeted interception, real-time geolocation, IMSI catchers, placement of microphones and cameras in private spaces, computer intrusion and hacking, and bulk metadata analysis.
Algorithmic surveillance (“black boxes”): Intelligence agencies may install automated devices at data centers and telecommunications networks to detect communications patterns matching selectors tied to terrorist activity. As noted above, France was the first EU country to authorize this form of automated algorithmic scanning of communications metadata.[3] The CNCTR’s 2023 annual report noted an increasing use of the most intrusive techniques, including microphones in private places, collection of all computer data, and phone and computer hacking.[15]
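The technical design of these systems is classified, but the underlying idea (automated matching of connection metadata against selectors, without human review of content) can be illustrated with a deliberately simplified, hypothetical sketch. Everything below (record fields, selector values, function names) is invented for illustration and does not describe the actual deployed systems.

```python
# Hypothetical, highly simplified illustration of selector-based metadata
# matching. The deployed "black box" systems are classified; the record
# fields, selector values, and function names below are invented.
from dataclasses import dataclass
from datetime import datetime

@dataclass
class MetadataRecord:
    src: str            # originating identifier (e.g., phone number or IP)
    dst: str            # destination identifier
    timestamp: datetime
    protocol: str       # e.g., "voice", "sms", "ip"

# In this toy model a "selector" is simply an identifier of interest;
# the real systems reportedly also match behavioural patterns.
WATCHED_SELECTORS = {"+33700000000", "suspicious-host.invalid"}

def matches_selector(record: MetadataRecord) -> bool:
    """Flag a record if either endpoint appears in the selector set."""
    return record.src in WATCHED_SELECTORS or record.dst in WATCHED_SELECTORS

def scan(records: list[MetadataRecord]) -> list[MetadataRecord]:
    """Return only matching records: content is never inspected, only
    connection metadata (who contacted whom, when, over which protocol)."""
    return [r for r in records if matches_selector(r)]

traffic = [
    MetadataRecord("+33600000001", "+33600000002", datetime(2025, 1, 1, 12, 0), "voice"),
    MetadataRecord("+33600000003", "+33700000000", datetime(2025, 1, 1, 12, 5), "sms"),
]
flagged = scan(traffic)   # only the second record matches a selector
```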
CNCTR (Commission nationale de contrôle des techniques de renseignement)
The CNCTR replaced the former CNCIS as the independent oversight body for intelligence surveillance techniques. It is composed of magistrates from the Council of State and Court of Cassation, qualified individuals, and parliamentary representatives.[15]
Powers and limitations: The CNCTR provides prior opinions before authorization of surveillance techniques, but these opinions are not binding. The Prime Minister may override a CNCTR opinion with written justification. The commission also conducts post-hoc controls of ongoing surveillance operations. This advisory-only model means that the final decision on intrusive surveillance rests with the executive branch, not with an independent body.
International Electronic Communications Law (November 30, 2015)
A separate law passed alongside the Intelligence Act governs surveillance of international communications, defined as communications “emitted from or received abroad.” This law carries significantly reduced oversight compared to domestic surveillance:[16]
- The prior opinion of the CNCTR is not required for international surveillance
- International communications may be intercepted and exploited in bulk on French territory
- Retention periods: Content may be retained for 1 year after first exploitation, within a limit of 4 years after collection (8 years for encrypted content). Metadata may be retained for 6 years after collection (see the sketch after this list)
- Codified in Articles L.854-1 and following of the Code of Internal Security
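As a rough aid to reading these overlapping ceilings, the following sketch computes the latest permissible retention date for a given item under the rules stated above. It illustrates the arithmetic only, assuming 365-day years; the statute itself governs, and the function names are invented.

```python
# Illustration of the retention arithmetic only, assuming 365-day years;
# the statute itself governs. Function names are invented.
from datetime import datetime, timedelta

YEAR = timedelta(days=365)

def content_retention_deadline(collected: datetime,
                               first_exploited: datetime,
                               encrypted: bool = False) -> datetime:
    """Content: 1 year from first exploitation, capped at 4 years from
    collection (8 years for encrypted content)."""
    ceiling = collected + (8 if encrypted else 4) * YEAR
    return min(first_exploited + YEAR, ceiling)

def metadata_retention_deadline(collected: datetime) -> datetime:
    """Metadata: 6 years from collection."""
    return collected + 6 * YEAR

# Example: content collected on 2024-01-01 but first exploited only in
# mid-2027 hits the 4-year ceiling before the 1-year exploitation limit.
print(content_retention_deadline(datetime(2024, 1, 1), datetime(2027, 6, 1)))
```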
The distinction between “domestic” and “international” communications carries practical significance: in an era of global internet routing, a substantial volume of communications between two people inside France may transit through foreign servers and thereby qualify as “international” under this framework.
SILT Law 2017 (Loi renforçant la sécurité intérieure et la lutte contre le terrorisme, November 1, 2017)
The SILT Law was enacted to formally end the state of emergency that had been in effect since the November 2015 Paris attacks, while permanently incorporating key emergency powers into ordinary law. Critics describe it as “making the state of emergency permanent.”[17]
Powers made permanent:
- Perimeters of protection: Administrative authorities can establish secure perimeters around events or locations
- Administrative closure of places of worship: Authority to shut down religious venues deemed to incite terrorism
- House visits: Police may conduct searches of private residences with judicial authorization (formerly “administrative searches”)
- MICAS (Mesures individuelles de contrôle administratif et de surveillance): Formerly “house arrest.” Authorities can restrict individuals to a municipality, require daily police check-ins, confiscate passports, and impose electronic bracelet monitoring for up to one year. The state is not required to disclose evidence to the affected person[17]
These measures were initially subject to a sunset clause requiring parliamentary renewal. They have been repeatedly renewed, and the push toward full permanence continues.[18]
Intelligence Agencies
DGSI (Direction générale de la sécurité intérieure)
France’s domestic intelligence service, responsible for counterterrorism, counterintelligence, counter-subversion, and economic protection within French territory. The DGSI operates under the Ministry of the Interior and serves as the primary agency for internal security threats.
DGSE (Direction générale de la sécurité extérieure)
France’s foreign intelligence service, responsible for intelligence collection, covert operations, and signals intelligence outside French borders. The DGSE contains a technical directorate (Direction Technique, or DT) that handles signals intelligence, internally designated ROEM (Renseignement d’Origine Électromagnétique). The DT operates a network of CRE (Centre de Renseignement Électronique) stations overseas, providing France with independent SIGINT collection capabilities spanning the Mediterranean, Africa, and the Middle East.[19]
Nine Eyes: Third-Party Status
France is a member of the Nine Eyes intelligence alliance. The DGSE maintains a formal bilateral SIGINT agreement with the NSA under the codename Lustre.[2] As a “3rd Party” partner under the UKUSA framework:
- France and the Five Eyes exchange raw SIGINT data (not just finished intelligence reports)
- However, unlike Five Eyes “2nd Party” members, 3rd Party partners like France are not automatically exempt from being targeted by NSA intelligence collection
- An internal NSA document states: “We can, and often do, target the signals of most 3rd party foreign partners”[20]
This means that while France shares raw signals intelligence with the United States, French communications, including those of government officials, may simultaneously be collected and analyzed by the NSA. The dual nature of this relationship was underscored by revelations during the Danish Operation Dunhammer scandal, which confirmed that NSA surveillance of European allies included senior French officials.[21]
Commercial Surveillance Procurement
Despite operating a sophisticated intelligence infrastructure through the DGSI and DGSE, France has supplemented its domestic capabilities with commercial surveillance technologies from private vendors. These procurement decisions reveal both the practical limitations of even well-resourced intelligence services and the reach of a global market for surveillance tools that operates outside traditional intelligence frameworks.
Palantir Technologies and DGSI
In 2025, France’s domestic intelligence agency DGSI renewed its contract with Palantir Technologies for data analytics and intelligence platform services. The contract provides pattern-matching and link-analysis capabilities across counterterrorism and counterintelligence datasets.[35]
The use of Palantir by DGSI raises questions about data sovereignty, particularly given Palantir’s close ties to US intelligence agencies and its obligations under the US CLOUD Act. While DGSI operations are governed by the Intelligence Act 2015 and subject to CNCTR oversight, the introduction of a US-based commercial analytics layer creates potential extraterritorial access pathways to French intelligence holdings that did not exist when all capabilities were domestically developed and hosted.
NSO Group Pegasus: Attempted Procurement and Scandal
France’s relationship with Israeli spyware vendor NSO Group has been marked by controversy. French intelligence and law enforcement agencies attempted to procure NSO’s Pegasus spyware, a sophisticated mobile device exploitation tool capable of remotely accessing encrypted communications, activating cameras and microphones, and extracting all data from targeted smartphones.
The procurement faced intense scrutiny after revelations that Pegasus had been used to target French journalists, lawyers, and politicians. In July 2021, the Forbidden Stories consortium revealed that French President Emmanuel Macron’s phone number appeared on a list of potential Pegasus targets, allegedly selected by Morocco’s intelligence services. The revelations triggered a diplomatic incident and forced France to confront the fact that the same surveillance tools it sought to purchase for its own intelligence operations were being used against French officials by foreign governments.[36]
Following the scandal, France suspended further Pegasus procurement discussions, though it did not immediately disclose the full extent of its prior use of NSO technologies. The episode illustrates a fundamental problem with the commercial spyware market: tools purchased for legitimate intelligence purposes create vulnerabilities when the same vendor sells to dozens of governments with varying human rights records and geopolitical interests.
The Oversight Gap
Commercial surveillance procurement in France operates in a regulatory gray zone. When DGSI or DGSE directly conducts surveillance using statutory powers under the Intelligence Act 2015, those operations require Prime Ministerial authorization and CNCTR review. But when agencies purchase analytics platforms or spyware from commercial vendors, those contracts are treated as ordinary procurement decisions subject only to standard government contracting rules.
This creates a structural asymmetry: intelligence collection using French-developed capabilities faces oversight by the CNCTR and periodic review by the Council of State, while commercially procured tools (Palantir analytics, NSO spyware, and similar technologies) enter the intelligence apparatus through procurement processes that lack equivalent safeguards. The result is that commercially procured tools can circumvent the transparency and accountability mechanisms France established after the 2015 surveillance law reforms.
France’s experience with Pegasus demonstrates the risks: intelligence agencies can find themselves simultaneously using and being targeted by the same commercial surveillance tools, with limited visibility into who else has purchased access and how those capabilities are being deployed.
Cryptography Regulation
France historically maintained strict cryptography controls, treating strong encryption as a dual-use munition subject to export licensing and domestic authorization. While the country has liberalized use restrictions over the past two decades, significant legal powers remain for compelling decryption and intercepting encrypted communications.[31]
Current Legal Framework
Law 2004-575 of June 21, 2004 (the Loi pour la confiance dans l’économie numérique, Articles 29 et seq.) governs cryptology in France. While the use of encryption is now unrestricted for individuals and businesses, the supply, import, and export of cryptographic products remain subject to administrative controls:[32]
- Domestic supply and provision: Cryptography services and products may be supplied within France without prior authorization, although supply of products providing confidentiality functions must be declared to the Agence nationale de la sécurité des systèmes d’information (ANSSI)
- Import and intra-EU transfer: Import from outside the EU and transfer within the EU likewise require declaration to ANSSI for products providing confidentiality functions
- Export: Export outside the EU remains subject to licensing under France’s implementation of the Wassenaar Arrangement and EU dual-use export controls
Compelled Decryption and Key Disclosure
France’s legal framework provides several mechanisms through which authorities can compel access to encrypted communications:
Judicial decryption orders: Under the Code of Criminal Procedure, investigating judges may order individuals or service providers to decrypt data or provide encryption keys. Refusal to comply can result in criminal penalties of up to three years’ imprisonment and EUR 270,000 in fines (higher penalties apply if the refusal obstructs an investigation into organized crime or terrorism).[33]
Intelligence Act provisions: The Intelligence Act 2015 grants intelligence agencies access to encrypted communications metadata and, where authorized by the Prime Minister after an opinion from the CNCTR, allows real-time interception. It does not, however, grant automatic authority to compel service providers to design backdoors into end-to-end encryption systems.
The "Ghost Participant" Proposal (2025) – Rejected
In March 2025, the French National Assembly rejected a controversial Interior Ministry proposal that would have forced messaging platforms like Signal and WhatsApp to allow law enforcement to join encrypted group chats as hidden participants. The proposal would have enabled authorities to read messages in real-time without the knowledge or consent of chat participants, a mechanism critics labeled a “ghost participant” backdoor.[34]
The Interior Ministry argued the measure was necessary to combat organized crime and terrorism in an era of ubiquitous end-to-end encryption. Civil liberties organizations, cryptography experts, and technology companies countered that:
- The proposal would require messaging platforms to fundamentally alter their encryption architectures, creating systemic vulnerabilities exploitable by malicious actors
- “Ghost participant” mechanisms would undermine user trust in secure communications and likely drive criminal actors to non-compliant platforms
- France’s existing legal authorities, including device-based surveillance via spyware, already provide lawful access to encrypted communications content
The National Assembly’s rejection of the ghost participant proposal was welcomed by encryption advocates and digital rights organizations, particularly in contrast to the United Kingdom’s contemporaneous use of Technical Capability Notices to block Apple’s Advanced Data Protection in the UK.[34]
International Context
France’s approach to encryption regulation reflects a tension between its historical cryptography controls, its role as a major technology market within the EU, and the “going dark” concerns articulated by law enforcement and intelligence agencies. The rejection of the ghost participant proposal suggests that, as of 2025, France remains unwilling to mandate structural backdoors, though authorities retain significant legal powers to compel decryption on a case-by-case basis.
Data Retention
The French Council of State (Conseil d’État) has ruled that the existing national security threat in France justifies generalized retention of connection data by ISPs and telecommunications providers. However, the court ordered the government to regularly reassess the threat justification and to submit intelligence use of retained data to clearance by an independent authority.[22]
The Council of State’s position attempts to balance the CJEU’s restrictions (which hold that general, indiscriminate data retention is only permissible in response to serious national security threats) with French operational needs. The practical effect is to validate France’s broad retention regime, subject to periodic government reassessment of the underlying threat. ISPs and telecommunications providers must retain traffic and location data under this framework.
The requirement for “periodic reassessment” provides a formal constraint, but the national security justification has been continuously reaffirmed since the 2015 attacks, and no reassessment has resulted in a reduction of retention obligations.
Age Verification for Pornographic Content
France has implemented a strict age verification regime for online pornography. In May 2024, Law No. 2024-449 aimed at securing and regulating the digital space (the “SREN” law) strengthened the existing system and empowered Arcom (France’s audiovisual and digital communication regulator) to adopt technical standards for age verification systems.[27]
Mandatory Age Verification (Effective January 11, 2025)
In October 2024, Arcom announced new rules for adult operators and platforms with pornographic content. The rules came into effect on January 11, 2025, with platforms given a transition period to implement suitable age verification solutions ending on April 11, 2025.[28]
The order mandates that any platform offering pornographic content in France must implement robust, privacy-respecting age checks. Acceptable methods include digital ID verification, biometric age estimation, or document checks; simple self-declarations (clicking an “I am 18” box) are explicitly considered insufficient.
Double Anonymity Principle
France’s age verification standard incorporates a “double anonymity” privacy principle:[29]
- The pornography website does not know the user’s identity
- The age verification provider does not know which sites the user visits
Platforms must offer at least one age verification method that complies with the double anonymity principle, with mandatory compliance from April 11, 2025. This approach attempts to balance child protection with privacy rights by preventing the creation of centralized databases linking identities to pornography consumption.
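One way to read the double anonymity requirement is as a token-passing protocol: the verification provider attests to the user’s age without learning the destination site, and the site checks the attestation without learning the user’s identity. The sketch below is a deliberately simplified toy model of that flow; the constructions actually contemplated under the CNIL/Arcom framework rely on stronger cryptography (such as blind signatures or zero-knowledge proofs), and a bare HMAC with a shared key stands in for the issuer’s signature here.

```python
# Toy model of a "double anonymity" age token. Real deployments use stronger
# cryptography; the HMAC key shared between issuer and verifier is a
# simplification for illustration only.
import hashlib
import hmac
import secrets

ISSUER_KEY = secrets.token_bytes(32)   # held by the age-verification provider

def issue_age_token() -> tuple[bytes, bytes]:
    """Age-verification provider: after checking the user's age by its own
    means (ID document, bank card check, facial age estimation, ...), it
    issues a single-use token asserting only 'over 18'. It never learns
    which website the token will be presented to."""
    nonce = secrets.token_bytes(16)
    tag = hmac.new(ISSUER_KEY, b"over18:" + nonce, hashlib.sha256).digest()
    return nonce, tag

def verify_age_token(nonce: bytes, tag: bytes) -> bool:
    """Website: checks that the token comes from a recognised provider. It
    learns only that the visitor holds a valid 'over 18' token, never the
    visitor's identity."""
    expected = hmac.new(ISSUER_KEY, b"over18:" + nonce, hashlib.sha256).digest()
    return hmac.compare_digest(expected, tag)

nonce, tag = issue_age_token()        # step 1: user obtains a token
assert verify_age_token(nonce, tag)   # step 2: user presents it to the site
```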
Enforcement and Geographic Scope
Arcom has enforcement authority to impose penalties for non-compliance of up to EUR 150,000 or 2% of worldwide turnover (excluding VAT), whichever is higher.
In February 2025, an order extended the law’s scope to include 17 services offering pornographic content based in various EU member states, asserting France’s jurisdiction over non-French platforms accessible to French users.[30]
Criticism and Privacy Concerns
While the double anonymity principle addresses some privacy concerns, digital rights organizations have questioned whether age verification can be truly privacy-preserving at scale. Critics note that any centralized age verification system, even with anonymization, creates metadata that could potentially be exploited by hackers, authoritarian regimes, or future governments. The requirement to verify age necessarily creates a record that a particular individual accessed age-restricted content at a particular time, even if the specific site is not logged.
France’s enforcement approach has positioned it among the most active EU member states in age verification policy.
International Data Sharing Agreements
Despite its active GDPR enforcement and blocking statutes designed to protect against unilateral foreign law enforcement requests, France participates in extensive international data sharing frameworks that give foreign agencies pathways to access data on French persons, often through processes that operate outside CNIL oversight and French judicial review.
Mutual Legal Assistance Treaty with the United States
France maintains a bilateral MLAT with the United States, with the Ministry of Justice serving as the central authority for processing requests. The MLAT allows French law enforcement to request data on US persons, and US law enforcement to request data on French persons, through diplomatic channels with average processing times of 10 months.[37]
Nine Eyes Intelligence Sharing
France is a member of the Nine Eyes intelligence alliance, an expansion of the original Five Eyes (US, UK, Canada, Australia, New Zealand) that includes Denmark, France, Netherlands, and Norway. As a Nine Eyes member, French intelligence services share signals intelligence (SIGINT) with Five Eyes partners, though with less privileged access than the core Five Eyes members.[38]
The Nine Eyes framework creates a reciprocal surveillance bypass: French intelligence services can collect data on US, UK, or other partner nations’ persons and share it with those countries’ agencies, while NSA, GCHQ, and other Five Eyes agencies can collect on French persons and share with French intelligence. According to Privacy International, data collected via intelligence sharing programs can be shared with law enforcement, potentially bypassing CNIL oversight and French judicial warrant requirements.
EU Law Enforcement Data Sharing Frameworks
Schengen Information System (SIS II): France participates in the EU’s largest law enforcement database, processing hundreds of thousands of queries daily on wanted persons, missing individuals, and objects across all Schengen countries. French police can query SIS II in real time and contribute alerts visible to law enforcement across the Schengen zone.
European Investigation Order (EIO): France participates in the EIO framework, which allows French judges and magistrates to make binding requests to other EU member states for evidence, witness hearings, telephone interceptions, banking information, and other investigative measures. The EIO is based on mutual recognition, and executing authorities are obliged to recognize and execute requests.
Prüm Convention: France was an original signatory of the Prüm Convention (2005) and participates in automated DNA, fingerprint, and vehicle registration data comparison across EU member states. The Prüm II Regulation (2024) expands this to include facial images and police records.
EU-US Data Sharing Frameworks
EU-US Umbrella Agreement: Entered into force February 1, 2017, this agreement governs personal data exchanged between EU and US law enforcement for criminal investigations. It grants French citizens equal treatment with US citizens for judicial redress rights before US courts.
SWIFT/TFTP Agreement: Under the Terrorist Finance Tracking Program, the US Treasury can subpoena SWIFT for financial transaction data, with Europol verification. This affects French persons’ international wire transfers and financial messaging data.
PNR Agreements: France participates in the EU-US PNR agreement, enabling transfer of passenger data from French air carriers to US Customs and Border Protection. Every passenger on France-US flights has comprehensive personal data (name, itinerary, payment, contacts) shared with US authorities.
French Blocking Statutes: Limited Protection
France maintains national-level blocking statutes (modernized 2022) designed to protect against unilateral US law enforcement requests, reflecting sovereign sensitivities about foreign data demands. However, these blocking statutes apply only to unilateral foreign requests; they do not prevent data sharing through MLATs, the EU-US Umbrella Agreement, Nine Eyes intelligence channels, or EU frameworks like SIS II and Prüm.[39]
The result is that blocking statutes provide limited protection in practice: while they may prevent a US prosecutor from directly compelling a French company to produce data, the US can request the same data via MLAT (with the French Ministry of Justice processing the request), the EU-US Umbrella Agreement (law enforcement cooperation), SWIFT/TFTP (financial data), PNR agreements (passenger data), or intelligence sharing (Nine Eyes channels).
Multilateral Frameworks
Interpol I-24/7: France participates in Interpol’s global information sharing network (195 countries, 100,000+ messages daily) for Red/Blue notices, biometric data, and criminal intelligence.
Egmont Group: Tracfin (French FIU) participates in the Egmont Group network of 164+ Financial Intelligence Units, sharing financial intelligence on money laundering and terrorist financing.
Europol: French law enforcement participates in Europol data sharing, which includes cooperation agreements with the US FBI. Intelligence sharing between Europol and FBI increased 30% in recent years, creating an additional pathway for French person data to flow to US authorities.
The Privacy Backdoor Effect
Despite GDPR protections enforced by CNIL, French blocking statutes, and French judicial warrant requirements, international data sharing agreements create alternative pathways for accessing French person data:
- Nine Eyes Laundering: NSA/GCHQ can collect on French persons and share with French intelligence, bypassing French judicial oversight; French intelligence can collect on US/UK persons and share with partner agencies
- MLAT Bypass: US authorities can request data via MLAT with Ministry of Justice processing, circumventing blocking statutes and potentially involving lower evidentiary standards than French judicial warrants
- EU Framework Sharing: French person data entered into SIS II, Prüm, or EIO channels becomes accessible to 27 EU member states, and through Europol cooperation, potentially to US FBI
- SWIFT/PNR Dragnet: All international financial transactions and air travel subject to US access via TFTP and PNR agreements
For French persons, this means data nominally protected by GDPR, CNIL enforcement, and French judicial oversight can be accessed through MLAT channels, Nine Eyes intelligence sharing, EU law enforcement frameworks (SIS II, EIO, Prüm, Europol), SWIFT/TFTP financial surveillance, or PNR passenger data agreements. The blocking statutes, while symbolically important for French sovereignty, provide limited practical protection against the web of multilateral data sharing frameworks to which France is a party.
Recent Developments
Algorithmic Video Surveillance for the 2024 Olympics
In March 2023, France passed Article 10 of the Olympic and Paralympic Games Law, authorizing experimental algorithmic video surveillance (AVS) at all sporting, recreational, and cultural events with 300 or more participants. The authorization ran until March 31, 2025.[23]
The AVS systems analyze CCTV feeds for suspicious behaviors (abandoned objects, unusual crowd movements, atypical trajectories) but officially exclude biometric identification and facial recognition. Critics argue that the systems necessarily capture “physiological features and behaviours” that constitute biometric data under GDPR definitions.[24]
Push for Permanence
After the Olympics, Paris police prefect Laurent Nuñez publicly favored extending the AVS authorization. A bill from the Républicains party proposes a three-year extension, and a separate transport security law would extend algorithmic analysis of public transport CCTV until 2027.[24]
The CNIL has warned against a “ratchet effect”: the tendency for temporary surveillance measures to become permanent without adequate democratic debate. The authority has called for a proper assessment of the Olympic experiment’s results before any extension is considered.[25]
CNIL Enforcement Trends
The increase from EUR 55 million in fines in 2024 to EUR 486.8 million in 2025 reflects a significant escalation in CNIL enforcement. The authority’s 2025 strategic plan emphasizes AI, Big Tech, and cookie compliance as priority enforcement areas.[26]
CNIL 2025 Enforcement Breakdown
Google EUR 325 million (September 2025): The CNIL’s largest-ever single fine, imposed on Google LLC and Google Ireland for displaying ads between emails in Gmail without consent and placing cookies during Google account creation without valid consent. The case originated from a 2022 complaint by NOYB and affected over 74 million accounts, with 53 million users seeing unauthorized advertisements in Gmail’s “Promotions” and “Social” tabs.[40]
Shein EUR 150 million (September 2025): Infinite Styles Services (Shein’s Irish subsidiary) was fined for pre-loading advertising cookies on visitors’ devices before any consent was given and for continuing to place and read cookies even after users clicked “Refuse all.” With an average of 12 million monthly French visitors, the scale of non-compliance was a key factor in the penalty.[41]
Free/Free Mobile EUR 42 million (January 2026): The CNIL fined Free Mobile (EUR 27 million) and Free (EUR 15 million) for inadequate security measures that led to an October 2024 breach affecting 24 million subscriber contracts, including IBANs. The restricted committee found that authentication procedures for VPN access were not sufficiently robust and that anomaly detection measures were ineffective. Free Mobile was also cited for retaining subscriber data without justification for an excessive period.[42]
France Travail EUR 5 million (January 2026): France Travail (formerly Pôle Emploi) was fined for failing to secure job seekers’ data, after a social engineering attack in early 2024 compromised the personal data of approximately 43 million individuals, including national insurance numbers, email and postal addresses, and telephone numbers. The CNIL found that authentication procedures allowing partner organization Cap Emploi advisers to access France Travail systems were not sufficiently robust.[43]
Narcotrafficking Law (Law 2025-532 of June 13, 2025)
Enacted June 13, 2025, the loi visant à sortir la France du piège du narcotrafic significantly expands French surveillance capabilities under the banner of combating organized drug trafficking. The law extends the use of algorithmic “black boxes” (previously limited to counterterrorism under the Intelligence Act 2015) to drug trafficking, arms trafficking, explosives trafficking, and associated money laundering investigations. Until December 31, 2028, intelligence services may experiment with algorithmic techniques to detect internet activity associated with these threats. The law also extends algorithmic video surveillance (AVS) to 2027. An earlier version of the bill included a controversial provision that would have forced encrypted messaging platforms to provide law enforcement with decrypted messages within 72 hours; this encryption backdoor was rejected by the National Assembly, though the broader surveillance expansions remained in the final text as promulgated.[44][45]
EU AI Act Implementation Delays
France missed the August 2, 2025 deadline to designate national competent authorities under the EU AI Act. In September 2025, the Directorate-General for Enterprises (DGE) published a draft proposing 17 market surveillance authorities, with the CNIL responsible for overseeing 15 AI use cases. However, as of early 2026, the designations have not been formally adopted. Provisions to formalize the designations were initially included in the DDADUE bill (November 2025) but were later withdrawn before parliamentary submission. France remains one of multiple EU member states that failed to meet the designation deadline.[46]
NIS2: Loi Résilience
France has not yet adopted its transposition of the NIS2 Directive. The Loi relative à la résilience des infrastructures critiques et au renforcement de la cybersécurité (Loi Résilience) was presented to the Council of Ministers in October 2024 and adopted by the Senate in March 2025, but remained under consideration in the National Assembly as of early 2026. In May 2025, the European Commission sent France a reasoned opinion (the second stage of infringement proceedings) for failure to notify full transposition, alongside 18 other member states. ANSSI will serve as the central enforcement authority, and the framework is expected to bring approximately 15,000 entities into scope, including roughly 2,000 classified as essential entities. Final promulgation is expected in Q1 2026, with technical decrees from ANSSI to follow.[47]
CJEU Challenge to French Age Verification
In March 2024, the Conseil d’État referred a preliminary question to the CJEU regarding whether France’s SREN law age verification requirements can be applied to pornographic websites established in other EU member states. The case turns on the e-Commerce Directive’s country-of-origin principle, which generally prevents member states from imposing requirements on services established in another member state. In September 2025, the Advocate General published an opinion concluding that France’s approach may be incompatible with EU law unless the European Commission approves. A final CJEU ruling, expected in 2026, could reshape the permissibility of national age verification mandates across the EU.[48]
Chat Control (EU CSAM Regulation)
France has generally supported the EU’s proposed regulation to combat child sexual abuse material (CSAM), commonly known as “Chat Control.” On November 26, 2025, the Council of the EU adopted a general approach that dropped the requirement for mandatory automated scanning of encrypted communications but preserved the ability for member states to require “high-risk” services to implement additional mitigation measures. The compromise text moved to trilogue negotiations with the European Parliament in early 2026. Critics warn that the “voluntary” framing could still create indirect pressure on platforms to implement scanning systems that undermine end-to-end encryption.[49]
DDADUE Law: Unified Collective Redress (April 30, 2025)
Law No. 2025-391 of April 30, 2025, known as the DDADUE law, created a unified class action regime in France, transposing EU Directive 2020/1828 on representative actions. Previously, French class actions were limited to specific sectors; the new regime allows collective actions against any company on any legal ground, provided victims are in a similar situation and suffered from the same breach. A wider range of organizations, including approved associations and non-profit entities, may now bring class actions without requiring prior approval. The law applies to all class actions initiated from May 2, 2025 onward.[50]
CNIL AI Guidelines (2025)
Throughout 2025, the CNIL published a series of practical guidelines on artificial intelligence and GDPR compliance. In February 2025, two guidelines addressed informing data subjects and respecting their rights in the context of AI models. In June 2025, two further recommendations covered reliance on legitimate interest as a legal basis for AI model development and measures for personal data collection via web scraping. In July 2025, three additional guidance documents addressed annotation of training data, security during AI system development, and the GDPR status of AI models. The guidance positions the CNIL as one of the most active European DPAs in providing practical AI compliance frameworks ahead of full EU AI Act implementation.[51]
DSA Enforcement Framework (2025)
France’s Digital Services Act (DSA) enforcement framework became fully operational in early 2025. Arcom serves as the Digital Services Coordinator, with shared competence between three authorities: Arcom (overall coordination and investigation), the CNIL (advertising transparency, profiling prohibitions, and minor protection provisions), and the DGCCRF (consumer protection supervision, formally designated January 3, 2025). A December 2024 decree specified Arcom’s investigation powers, including on-site inspections and judicial injunctions against non-compliant service providers. A cooperation agreement between the three authorities governs the division of competences.[52]
CNIL 2024 Annual Report
The CNIL’s 2024 annual report documented a record 17,772 complaints received (telecoms, web, and social networks accounting for 49%), 5,629 data breach notifications (a 20% increase over 2023, with breaches affecting more than one million people doubling year-over-year), and 331 corrective measures including 87 sanctions totaling over EUR 55 million. One-third of sanctions concerned security failings. The CNIL also published its first 12 practical AI recommendation sheets and carried out 173 awareness-raising actions, including 84 focused on minors.[53]
EU Data Act (Applicable September 12, 2025)
The EU Data Act became applicable on September 12, 2025, imposing new data access and sharing obligations on manufacturers and service providers of connected (IoT) products in France. Users of connected devices must now be given free, secure access to the data they generate, and data holders must facilitate third-party access upon user request. France’s SREN law (Law No. 2024-449) had already anticipated some Data Act obligations by establishing cloud computing interoperability, portability, and functional equivalence requirements.[54]
Telemarketing Consent Law (June 30, 2025)
On June 30, 2025, France adopted a law shifting telemarketing regulation from an opt-out model to an opt-in model. Effective August 11, 2026, all outbound commercial prospecting calls to personal phone numbers will be prohibited unless the recipient has given explicit prior consent that is free, specific, informed, unambiguous, and revocable. The law ends the Bloctel opt-out registry system. Maximum penalties for violations rise to EUR 500,000 or up to 20% of annual turnover, with criminal penalties possible for grave offenses.[55]
