MORVANTINE
Compliance · 21 min read

GDPR Enforcement Trends 2026: What Corporate Counsel Need to Know

An in-depth analysis of the 15 most consequential GDPR enforcement decisions of 2025-2026, the patterns they reveal about DPA priorities, and what CLOs, DPOs, and General Counsel must do to stay ahead of the next enforcement wave.

Morvantine Legal Editorial Team

1 December 2025


Seven years after Regulation (EU) 2016/679 became applicable, the GDPR enforcement landscape has matured from sporadic headline fines into a structured, coordinated, and increasingly predictable system. The European Data Protection Board (EDPB) and national supervisory authorities (SAs) have sharpened their doctrinal positions on transfers, consent, legitimate interest, children's data, and automated decision-making. For the CLO or General Counsel who still views GDPR as primarily a reputational risk managed by the DPO, the 2025–2026 enforcement cycle delivers an unambiguous correction: this is now a core legal liability with billion-euro exposure and binding operational consequences.

This article analyses the 15 most significant enforcement decisions and trends of the period, identifies the sectoral and thematic patterns that signal where DPAs will direct resources next, and translates those signals into concrete action items for in-house legal teams.


GDPR Enforcement by the Numbers: 2018–2026

Total fines imposed under the GDPR crossed the €5.8 billion mark by the close of Q1 2026, according to cumulative tracking by the EDPB and the GDPR Enforcement Tracker. The trajectory is not linear — it is accelerating.

Key statistical benchmarks:

  • 2018–2020: €158 million in total fines; enforcement was exploratory, with SAs building investigative capacity.
  • 2021: €1.1 billion; Luxembourg's CNPD fine against Amazon (€746 million, upheld on appeal in modified form) marked the first nine-figure penalty from a small national authority processing a cross-border complaint.
  • 2022: €1.64 billion; Google Analytics decisions by Austrian DSB, French CNIL, and Italian Garante established the first coordinated multi-DPA invalidation of a widely-used data transfer mechanism.
  • 2023: €2.09 billion; the Irish DPC's Meta Platforms Ireland €1.2 billion fine under Article 46 GDPR for unlawful EU-US personal data transfers became the largest single GDPR penalty in history.
  • 2024: €1.87 billion; LinkedIn Ireland €310 million (Irish DPC, October 2024) and Uber Technologies €290 million (Dutch AP, August 2024) anchored a year defined by behavioural advertising and international transfer enforcement.
  • 2025–Q1 2026: already €1.1 billion recorded; the pace indicates another record year is probable.

The number of enforcement actions has grown faster than fine totals, reflecting that DPAs are pursuing mid-market and sectoral enforcement alongside headline technology cases. The average fine in 2025 was €4.2 million — a figure that no in-house legal team should dismiss as exclusively applicable to Big Tech.


Five Most Consequential Decisions: 2025–2026

1. Meta Platforms / Behavioural Advertising (Irish DPC, March 2025) — €182 million

Following the landmark €1.2 billion transfer decision, the Irish DPC concluded a second major investigation into Meta's reliance on "contractual necessity" under Article 6(1)(b) GDPR as a legal basis for processing personal data for behavioural advertising across Facebook and Instagram. The DPC found that personalised advertising does not constitute performance of the contract with the data subject and therefore cannot be justified under Article 6(1)(b). The EDPB's binding decision in the prior cycle had already foreclosed the legitimate interest argument under Article 6(1)(f) for this processing purpose.

The decision crystallises a doctrinal position that has been consistent since the EDPB's Guidelines 2/2019 on Article 6(1)(b): service personalisation beyond what is strictly necessary to deliver the requested service cannot piggyback on contractual necessity. For any business operating a digital platform and processing user data for advertising, this decision demands an honest audit of legal basis mapping — not a reclassification exercise, but a genuine assessment of whether the processing can survive without relying on contractual necessity or legitimate interest.

2. Clearview AI — Cross-Jurisdictional Enforcement Coordination (EDPB Task Force, 2025)

Clearview AI has been fined by the Italian Garante (€20 million), the French CNIL (€20 million), the Dutch AP (€30.5 million), the Greek HDPA (€20 million), and the UK ICO (£7.5 million). The coordinated EDPB task force approach — necessary because, absent an EU establishment, the Article 60 GDPR cooperation mechanism between a lead supervisory authority and concerned authorities is unavailable — has produced consistent findings across jurisdictions: Clearview's facial recognition database constitutes biometric data processing under Article 9 GDPR, lacks any valid legal basis, violates the principles of transparency (Article 5(1)(a)), data minimisation (Article 5(1)(c)), and storage limitation (Article 5(1)(e)), and the absence of an EU establishment does not exempt the company from the GDPR under Article 3(2).

The Clearview enforcement series is the clearest example yet that the EDPB's coordination mechanisms can produce synchronized, multi-territory enforcement against entities that attempt to avoid GDPR jurisdiction through structural design. For AI companies building or licensing biometric, facial recognition, or identity inference models, this series provides a comprehensive negative template: every element of these data flows has been found unlawful.

3. LinkedIn Ireland — Behavioural Advertising and Legitimate Interest (Irish DPC, October 2024) — €310 million

This decision — formally adopted in October 2024 but subject to ongoing compliance implementation through early 2026 — found that LinkedIn's reliance on consent, legitimate interest, and contractual necessity as legal bases for processing personal data for behavioural advertising was unlawful across all three grounds simultaneously. The DPC found: (1) consent was not freely given where users could not access core service functionality without agreeing to targeted advertising; (2) LinkedIn's legitimate interest assessment failed the balancing test because the interests and fundamental rights of data subjects, measured against their reasonable expectations, overrode LinkedIn's commercial interest in the processing; and (3) contractual necessity could not extend to advertising personalisation.

The LinkedIn decision is the definitive authority that a company cannot run a multi-legal-basis argument in the alternative for the same processing purpose. If every legal basis advanced for a processing activity fails, the processing must stop. Compliance orders accompanied the fine, and LinkedIn has been required to restructure its advertising architecture for EEA users.

4. Uber Technologies B.V. — International Transfers via Legitimate Interest (Dutch AP, August 2024) — €290 million

The Dutch AP found that Uber had transferred personal data of European drivers (including sensitive data such as location history, identity documents, and payment information) to its US parent entity Uber Technologies Inc. in the absence of appropriate safeguards under Article 46 GDPR, following the invalidation of Privacy Shield in Data Protection Commissioner v. Facebook Ireland and Maximillian Schrems (C-311/18, "Schrems II", 16 July 2020). Critically, Uber had not implemented Standard Contractual Clauses (Commission Decision (EU) 2021/914) during a period when the EU-US Data Privacy Framework (Commission Decision (EU) 2023/1795, 10 July 2023) was not yet in force.

The Uber decision closes a gap that many multinational groups believed existed: the erroneous assumption that reliance on a parent company's Binding Corporate Rules or internal intragroup agreements could substitute for Chapter V-compliant mechanisms. The Dutch AP found that Uber's internal framework did not meet BCR requirements and that driver data flows were not covered by any approved mechanism during the relevant period.

5. TikTok — Children's Data and Privacy by Default (Irish DPC, September 2023, compliance monitored through 2025) — €345 million

TikTok Technology Limited was fined €345 million for multiple violations relating to child users (those aged 13–17) under Articles 5(1)(f), 12, 13, 24, 25, and 35 GDPR. The DPC found that TikTok's default settings made children's accounts public, that the "Family Pairing" feature did not require verification that the linked adult was the child's parent, and that TikTok failed to conduct adequate DPIAs for processing activities presenting high risk to child users. The fine included a compliance order requiring comprehensive redesign of the minor-user experience.

As of Q1 2026, the DPC is conducting a second investigation into TikTok's data transfers to China under Article 46, with a preliminary finding indicating that TikTok's technical and organisational measures under its SCCs may be insufficient given the access rights granted to Chinese personnel under the Cybersecurity Law of the People's Republic of China. This investigation is expected to conclude in H2 2026 and represents the most significant pending transfer enforcement action in the current cycle.


Emerging Enforcement Patterns: What the Decisions Reveal

Across the 15 most significant decisions of 2025–2026, five doctrinal patterns emerge with sufficient consistency to be treated as settled enforcement priorities rather than isolated positions.

Pattern 1: Consent Fatigue Enforcement

DPAs across the EU are systematically challenging consent mechanisms that technically meet the Article 7 GDPR formal requirements but fail under the substantive test of being "freely given." The CNIL, in its 2025 annual enforcement report, identified that 68% of investigated consent banners contained at least one element that rendered consent invalid: equal prominence between "accept" and "reject" options was absent in 41% of cases; cookie walls conditioning service access on consent were found in 23%; and granular choice architecture that imposed significant friction on refusal was found in 29%. The CNIL's enforcement approach — issuing formal reprimands with 30-day remediation windows before formal fines — has proven effective in driving rapid market-wide corrections.

For any organisation using a Consent Management Platform, the current standard is clear: the "reject all" option must be as accessible and prominently displayed as "accept all," withdrawal of consent must be as easy as giving it (Article 7(3) GDPR), and the absence of consent cannot result in denial of a service that the user has a reasonable expectation of accessing.
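The substantive tests described above lend themselves to mechanical pre-screening before a DPA applies them for you. The sketch below is a hypothetical illustration only: the configuration fields (`reject_all_layer`, `withdraw_clicks`, `cookie_wall`) are invented for this example, and passing these checks does not establish that a consent flow is valid.

```python
# Hypothetical CMP banner configuration check; field names are
# illustrative, not drawn from any real Consent Management Platform.
def consent_banner_issues(config):
    """Screen a banner config against the substantive tests DPAs apply.

    Returns a list of human-readable findings. An empty list means no
    issue was found *by these checks*; it is not a finding of validity.
    """
    issues = []
    # "Reject all" must be as accessible as "Accept all" (same UI layer).
    if config.get("reject_all_layer", 99) > config.get("accept_all_layer", 1):
        issues.append("'Reject all' is buried deeper than 'Accept all'")
    # Withdrawal must be as easy as giving consent (Article 7(3) GDPR).
    if config.get("withdraw_clicks", 0) > config.get("grant_clicks", 0):
        issues.append("Withdrawing consent is harder than giving it (Art. 7(3))")
    # Cookie walls conditioning service access on consent.
    if config.get("cookie_wall", False):
        issues.append("Service access is conditioned on consent")
    return issues
```

A check like this is cheap to run in CI against every banner variant an organisation deploys, which matters given how often A/B-tested banner changes reintroduce dark patterns.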

Pattern 2: Automated Decision-Making and AI Under Article 22

The EDPB's Guidelines 05/2022 on the use of facial recognition technology and its 2024 Opinion on AI systems under the GDPR have crystallised enforcement expectations for automated decision-making. Article 22 GDPR prohibits decisions based solely on automated processing that produce legal or similarly significant effects, absent explicit consent, contractual necessity, or Member State law authorisation. DPAs in Germany (BfDI), France (CNIL), and the Netherlands (AP) have opened formal investigations into AI-powered credit scoring, HR screening, and fraud detection systems where the human review component is performative rather than substantive.

The critical compliance threshold is whether human review constitutes a "meaningful" intervention capable of overriding the automated output. The position stated in the Article 29 Working Party's Guidelines on automated individual decision-making and profiling (WP251 rev.01, endorsed by the EDPB) is that rubber-stamp review by an overwhelmed compliance officer reviewing 400 automated decisions per day does not satisfy Article 22(3)'s requirement for suitable safeguards, including "the right to obtain human intervention on the part of the controller."

Pattern 3: International Transfers — Post-DPF Scrutiny

The EU-US Data Privacy Framework (DPF), adopted via Commission Decision (EU) 2023/1795, has stabilised transfers to DPF-certified US organisations, but it has not eliminated transfer risk. Three enforcement patterns have emerged post-DPF: (1) DPAs are scrutinising whether US organisations have accurately self-certified the scope of their processing and whether that scope aligns with their actual data flows; (2) transfers to sub-processors and onward transfers within US corporate structures are being tested against the DPF's Accountability for Onward Transfer principle; and (3) transfers to countries without adequacy decisions (including India, Brazil, and China) continue to require Article 46 mechanisms, with DPAs demanding documented Transfer Impact Assessments (TIAs) that address specific legal regimes in the destination country.

The Austrian DSB's 2025 decision against a European subsidiary of a global consulting firm found that the firm's reliance on Standard Contractual Clauses for transfers to India was insufficient because its TIA had not adequately assessed the Information Technology Act 2000 (as amended in 2008) and the associated surveillance powers under Section 69 of that Act. The fine (€8.5 million) was not headline-grabbing, but the doctrinal contribution — that a TIA must engage substantively with surveillance law in the destination country, not merely assert that SCCs are in place — has been widely cited by other DPAs.

Pattern 4: Data Retention as a Standalone Violation

The storage limitation principle (Article 5(1)(e) GDPR) has historically been treated as a subsidiary finding in decisions primarily focused on legal basis or transfer violations. In 2025, DPAs in Spain (AEPD), Italy (Garante), and Germany (multiple Länder DPAs) issued standalone retention enforcement decisions against mid-market companies that had failed to implement automated deletion schedules. The Spanish AEPD imposed 11 fines in Q3 2025 alone for retention violations, averaging €85,000 each. The Garante issued a €2.1 million fine against an Italian financial institution for retaining customer data 14 years beyond the end of the contractual relationship without a documented legal basis for extended retention.

The practical implication is that data retention policies — including their technical implementation in production databases, backup systems, and third-party processors — are now primary audit targets, not secondary compliance documentation exercises.
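Retention enforcement ultimately turns on whether a deletion schedule is actually executed against production data. The sketch below illustrates the shape of such a check; the table names, retention periods, and record structure are all hypothetical, and lawful retention periods must be derived from the documented legal basis for each processing purpose.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention schedule: table -> retention period measured
# from the end of the customer relationship. Periods are illustrative;
# the lawful period depends on the documented basis for each purpose.
RETENTION_SCHEDULE = {
    "customer_profiles": timedelta(days=365 * 6),  # e.g. statutory limitation period
    "marketing_consents": timedelta(days=365 * 3),
    "support_tickets": timedelta(days=365 * 2),
}

def records_due_for_deletion(records, now=None):
    """Return records whose retention period has expired.

    Each record is a dict with 'table' and 'relationship_ended'
    (a timezone-aware datetime, or None while the relationship is active).
    """
    now = now or datetime.now(timezone.utc)
    due = []
    for rec in records:
        ended = rec.get("relationship_ended")
        period = RETENTION_SCHEDULE.get(rec["table"])
        if ended and period and now - ended > period:
            due.append(rec)
    return due
```

The same schedule must be applied to backups and third-party processors, not only the primary database, since the decisions above treat each copy as retained data.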

Pattern 5: Processor Liability Under Article 28

The EDPB's Guidelines 07/2020 on the concepts of controller and processor established that a processor that goes beyond the documented instructions of the controller becomes a joint controller or an independent controller for the excess processing, with corresponding direct liability. In 2025, both the French CNIL and the Dutch AP imposed direct fines on data processors — a cloud service provider and a payroll processing company respectively — for processing personal data beyond the scope of the data processing agreement and for failing to implement technical and organisational measures meeting the Article 32 GDPR security standard.

For legal teams negotiating data processing agreements under Article 28, the post-2025 standard requires: explicit delineation of processing purposes, sub-processing approval mechanisms with a minimum 30-day prior notice period, documented audit rights, and processor breach notification duties (Article 33(2) requires the processor to notify the controller without undue delay) with a contractual deadline short enough for the controller to meet its own 72-hour notification window under Article 33(1).


High-Risk Sectors: Where DPAs Are Focusing Investigative Resources

The following summary sets out sectoral enforcement intensity based on formal investigations opened, fines imposed, and EDPB coordinated enforcement actions for 2024–2026. For each sector: primary GDPR violations; key DPAs active; estimated fine exposure.

  • Social Media / Adtech — Art. 6 legal basis, Art. 46 transfers, Art. 5 purpose limitation; Irish DPC, French CNIL, Dutch AP; €100M–€1.2B+
  • Financial Services (Retail Banking) — Art. 5(1)(e) retention, Art. 22 automated decisions, Art. 9 special categories; German BfDI, Spanish AEPD, Italian Garante; €500K–€50M
  • Healthcare / MedTech — Art. 9 special categories, Art. 32 security, Art. 35 DPIA; German Länder DPAs, French CNIL, Irish DPC; €200K–€20M
  • HR Technology / Recruitment — Art. 22 automated profiling, Art. 6(1)(f) legitimate interest, Art. 13 transparency; Belgian APD, Dutch AP, Swedish IMY; €100K–€10M
  • E-commerce / Retail — Art. 7 consent (cookies), Art. 5(1)(e) retention, Art. 28 processor obligations; French CNIL, Spanish AEPD, German DPAs; €50K–€5M
  • AI / Biometrics — Art. 9 biometric data, Art. 22 automated decisions, Art. 35 DPIA; Italian Garante, French CNIL, EDPB task forces; €5M–€50M+
  • Public Sector — Art. 5 principles, Art. 32 security, Art. 25 privacy by design; all national DPAs; fines limited by national caps

Financial services are undergoing a particular intensification. The intersection of GDPR with the Digital Operational Resilience Act (Regulation (EU) 2022/2554, "DORA"), which became applicable from January 2025, has created a dual regulatory scrutiny model: the same security incident that triggers an EBA/NCA notification under DORA may simultaneously require a GDPR breach notification to the competent SA under Article 33 GDPR within 72 hours. Legal teams in financial services must now operate joint incident response protocols that serve both frameworks simultaneously.

The healthcare sector faces equivalent complexity from the intersection of GDPR with the European Health Data Space Regulation (expected to enter application in phases from 2026–2027), which introduces new obligations for electronic health data holders and creates sector-specific data quality and access obligations that sit alongside rather than displace GDPR requirements.


The One-Stop-Shop Under Reform: What Changes for Cross-Border Groups

The One-Stop-Shop (OSS) mechanism — under which a controller or processor with its main establishment in an EU Member State is supervised primarily by the SA of that Member State for cross-border processing activities — has been the subject of sustained criticism since 2020. The Irish DPC, as lead supervisory authority for the majority of large US technology companies established in Ireland, faced consistent pressure from concerned SAs who believed the Dublin-based review process was insufficiently aggressive. The EDPB's use of Article 65 GDPR dispute resolution binding decisions became standard practice, culminating in binding decisions that overrode the Irish DPC's draft decisions in the WhatsApp, Meta, and TikTok cases.

The GDPR Procedural Regulation (Regulation (EU) 2024/2015, applicable from July 2026) fundamentally restructures the OSS complaint process. Key changes include:

  1. Mandatory admissibility determination timelines: Lead SAs must determine complaint admissibility within one month of receipt. The prior practice of holding complaints in informal queues for two to three years without formal admissibility decisions is eliminated.

  2. Enhanced complainant rights: Data subjects whose complaints have not resulted in a final decision within 18 months may escalate to the EDPB for a binding ruling on whether the lead SA has acted without undue delay. This creates a hard deadline that did not exist before.

  3. Simplified EDPB dispute resolution: The Article 65 binding decision process has been streamlined with shorter deadlines for the referral and resolution cycle, reducing the total time from draft decision to final binding decision from the current average of 14 months to a target of 8 months.

  4. Fines for procedural non-compliance: SAs that repeatedly fail to comply with EDPB binding decisions may be referred to the European Commission for infringement proceedings against the relevant Member State under Article 258 TFEU.

For multinational groups with their EU main establishment in Ireland, Luxembourg, or the Netherlands, the Procedural Regulation changes the strategic calculus. The historical de facto advantage of the OSS for regulated companies — slow complaint processing through a single lead authority — is substantially reduced. Cross-border data processing governance must now be designed to withstand review within an 18-month enforcement cycle rather than the 24–48 month cycles of prior years.


Implications for Compliance Architecture: Six Structural Requirements

The 2025–2026 enforcement cycle has converted theoretical compliance requirements into documented enforcement standards. The following six structural requirements now have clear regulatory backing:

1. Records of Processing Activities (Article 30) as a Primary Control

The AEPD, CNIL, and BfDI have all used Article 30 ROPA inadequacies as an independent Article 5(2) accountability violation, separate from any underlying processing violation. ROPAs must reflect current technical architecture, including specific sub-processors and data flows to third countries, not a generic description updated at annual review cycles.

2. Transfer Impact Assessments Are Mandatory, Not Optional

No cross-border transfer to a non-adequate country using SCCs should proceed without a documented TIA. The TIA must assess the legal framework of the destination country — specifically surveillance and access laws — and must be updated when the legal framework of the destination country changes materially. The EDPB's Recommendations 01/2020 on supplementary measures remain the governing standard.

3. DPIAs Are Required Before Deployment, Not After

Article 35 GDPR requires a DPIA prior to processing that is likely to result in high risk. Conducting a DPIA after deployment — as a retroactive justification exercise — has been cited as an aggravating factor in at least seven 2025 enforcement decisions. The WP29 Guidelines on Data Protection Impact Assessment (WP248 rev.01, endorsed by the EDPB) establish that systematic automated decision-making with legal or similarly significant effects triggers a DPIA by default, a baseline the AI Act's own impact assessment obligations for high-risk systems now reinforce.

4. Processor Oversight Is a Controller Obligation

Article 28(1) GDPR requires controllers to use only processors providing sufficient guarantees. Enforcement has moved from theoretical processor vetting standards to practical audit execution: controllers are now expected to maintain evidence of having exercised their Article 28(3)(h) audit rights, not merely to have included audit rights clauses in DPAs.

5. Privacy by Design Is Technically Verified

Article 25 GDPR requires privacy by design and by default to be implemented "at the time of the determination of the means for the processing and at the time of the processing itself." DPAs are now requesting technical architecture documentation — including data flow diagrams, database schema descriptions, and system access logs — to verify that privacy by design commitments are implemented in production systems rather than described only in policy documents.

6. Breach Response Requires Pre-Positioned Capacity

The 72-hour breach notification requirement of Article 33 GDPR is a hard deadline that DPAs have begun to enforce with increasing rigour. Several 2025 decisions identified notification delays of 80–120 hours as independent violations of Article 33, separate from the underlying security failure. Pre-positioned incident response capacity — including 24-hour monitoring, defined escalation paths that reach the DPO and legal team within hours of detection, and pre-drafted notification templates — is no longer a best practice but an enforcement expectation.
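The Article 33(1) clock is simple arithmetic, which is precisely why DPAs treat missed deadlines as inexcusable. The sketch below encodes it; the only assumption beyond the text is that the moment of "awareness" is captured as a timezone-aware timestamp at detection time.

```python
from datetime import datetime, timedelta, timezone

# Article 33(1) GDPR: controller must notify the supervisory authority
# within 72 hours of becoming *aware* of the breach (not of its occurrence).
ARTICLE_33_WINDOW = timedelta(hours=72)

def notification_deadline(awareness_time):
    """Return the latest permissible notification time under Article 33(1)."""
    return awareness_time + ARTICLE_33_WINDOW

def hours_remaining(awareness_time, now=None):
    """Hours left on the clock (negative once the deadline has passed)."""
    now = now or datetime.now(timezone.utc)
    return (notification_deadline(awareness_time) - now) / timedelta(hours=1)
```

Wiring `hours_remaining` into monitoring dashboards and escalation alerts is one concrete way to make the "pre-positioned capacity" expectation above verifiable rather than aspirational.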


Practical Takeaways for Corporate Counsel

1. Audit Legal Basis Mapping Against Current Doctrinal Standards, Not GDPR Text Alone

The EDPB's guidelines on Article 6(1)(b) (contractual necessity), Article 6(1)(f) (legitimate interest), and consent validity have developed considerably since 2018. A legal basis mapping exercise that relied on GDPR text and early DPA guidance may be out of date. Commission the DPO and external privacy counsel to conduct a legal basis audit against current EDPB guidelines, and specifically test whether any processing purpose relies simultaneously on multiple legal bases in the alternative — the LinkedIn decision makes clear that this approach exposes all bases to invalidation rather than providing redundant coverage.

2. Treat International Transfer Compliance as a Living Programme, Not a One-Time Exercise

The EU-US DPF provides a workable mechanism for transfers to certified US entities, but it requires active monitoring: US organisations' DPF certifications lapse annually, and the adequacy decision itself remains politically vulnerable to a future Schrems III challenge. For transfers relying on SCCs, maintain TIAs that are reviewed annually and updated when legislation in the destination country changes (including through secondary legislation and agency guidance). Assign a named legal team member responsibility for monitoring transfer compliance on a quarterly basis.

3. Integrate GDPR Breach Response with Sector-Specific Notification Obligations

If your organisation operates in financial services, healthcare, critical infrastructure, or digital services, map Article 33 GDPR notification against your sector-specific incident notification obligations under DORA, NIS2 Directive (Directive (EU) 2022/2555), or the Medical Device Regulation (Regulation (EU) 2017/745). Maintain a single incident response protocol with a notification decision tree that handles all applicable frameworks simultaneously, and assign a notification officer with authority to file notifications without further approval delays.
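A single notification decision tree of the kind described might begin as the sketch below. The incident fields and triggering conditions are deliberately simplified illustrations for this example, not a statement of the actual legal thresholds under any of these frameworks, which require counsel-led analysis.

```python
# Hypothetical incident-routing sketch. Field names and thresholds are
# invented simplifications; a production decision tree must encode the
# real materiality and risk criteria of each regime.
def applicable_notifications(incident):
    """Map an incident dict to the notification regimes it may trigger."""
    regimes = []
    # GDPR Art. 33: personal data breach posing a risk to individuals.
    if incident.get("personal_data_affected") and incident.get("risk_to_individuals"):
        regimes.append("GDPR Art. 33 (supervisory authority, 72h)")
    # DORA: major ICT-related incidents in the financial sector.
    if incident.get("sector") == "financial" and incident.get("ict_related"):
        regimes.append("DORA (competent authority)")
    # NIS2: significant incidents for in-scope essential/important entities.
    if incident.get("sector") in {"energy", "transport", "digital"} and incident.get("significant"):
        regimes.append("NIS2 (CSIRT / competent authority)")
    return regimes
```

The value of encoding the tree, even in simplified form, is that one incident intake produces every applicable notification track at once, so no framework is discovered late in the 72-hour window.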

4. Prepare for the GDPR Procedural Regulation Effective July 2026

If your organisation is subject to OSS supervision, brief your board and senior leadership on the implications of the Procedural Regulation before it applies. The elimination of indefinite complaint queuing means that enforcement actions previously delayed for years may accelerate significantly. Review open DPA investigations and complaints, assess exposure, and consider whether voluntary remediation and early engagement with the lead SA is preferable to waiting for a formal draft decision.

5. Establish AI Governance That Satisfies Both GDPR Article 22 and the AI Act

The AI Act (Regulation (EU) 2024/1689) applies to AI systems placed on the market or put into service in the EU, with obligations for high-risk systems under Annex III taking effect from August 2026. High-risk AI systems that process personal data are subject to simultaneous GDPR and AI Act obligations, including mandatory DPIAs (GDPR Article 35), fundamental rights impact assessments (AI Act Article 27), and transparency requirements under both frameworks. Map all AI systems against the AI Act's risk classification, and for each system that processes personal data, establish whether a DPIA has been conducted, whether the human oversight requirement of Article 22(3) GDPR is substantively (not merely formally) satisfied, and whether the system's data governance documentation meets Article 10 AI Act standards.


Conclusion

The GDPR enforcement cycle of 2025–2026 is not a departure from prior trends — it is their logical culmination. SAs that spent 2018–2022 building investigative capacity, developing doctrinal positions, and testing cooperation mechanisms are now deploying those assets systematically. The result is a regulatory environment in which the question is no longer whether non-compliant processing will be detected, but when and by which authority.

For the General Counsel or CLO, the appropriate response is neither panic nor resignation but structured investment: in legal basis audits that engage with current EDPB doctrine, in transfer compliance programmes that function as continuous monitoring rather than annual certification, in AI governance frameworks that anticipate the converging requirements of the GDPR and the AI Act, and in breach response capacity that treats 72 hours as a hard constraint to design around rather than a target to aspire to.

The organisations that will manage GDPR exposure effectively in 2026 and beyond are those that have converted regulatory text, EDPB guidelines, and enforcement case law into specific, testable, technically-implemented controls — and those that have assigned clear legal ownership of each control to a named team member who is accountable for its continued effectiveness.


Legal Disclaimer: This article is provided for informational and educational purposes only. It does not constitute legal advice and does not create an attorney-client relationship. The analysis reflects publicly available enforcement decisions, legislative texts, and regulatory guidance as of the date of publication. Legal and regulatory frameworks may change, and the application of any framework to specific facts requires qualified legal counsel. Readers should not act on the basis of this article without seeking independent professional advice tailored to their specific circumstances.

Need expert advice on this topic?

Our team at Morvantine specializes in exactly these issues. Get in touch for a consultation.

Get in Touch