Why the EU’s anonymisation method may not survive the GDPR test

May 6, 2026

Sergei Vassilvitskii, distinguished scientist at Google since 2012, has written to Brussels warning that the Commission’s proposed anonymisation scheme for forced search-data sharing is, by his red team’s own demonstration, breakable in 120 minutes. The decision deadline is 27 July.

There is a familiar genre of corporate complaint in EU regulatory proceedings: a US technology company protests a Brussels rule, frames the protest as a defence of user welfare, and is dismissed by regulators as making a self-interested argument in privacy clothing. The Reuters exclusive published on Tuesday makes that dismissal harder than usual.

Sergei Vassilvitskii, who has been a distinguished scientist at Google since 2012 and is one of the most-cited researchers in the field of differential privacy, has written to the European Commission warning that the Commission’s proposed anonymisation method for forced search-data sharing is breakable in less than two hours.

His exact words, in written comments to Reuters republished in the syndicated wire, were: “We are concerned because the EC’s approach to anonymisation fails to protect Europeans’ privacy: our red team managed to re-identify users in less than two hours.”

The number is unusually specific. It is also, on the technical literature, plausible.

What the EU is actually requiring

The proceeding sits inside the Digital Markets Act, the EU’s flagship competition framework for so-called gatekeeper platforms. On 27 January 2026, the Commission opened formal specification proceedings against Google under Article 6(11) of the DMA, which obliges gatekeeper search engines to grant third-party rivals access to anonymised ranking, query, click and view data on fair, reasonable and non-discriminatory (FRAND) terms.

Per the Commission’s own press materials, the proceeding is intended to specify, with operational precision, four things: the scope of the data that has to be shared, the anonymisation method that will be applied to it, the conditions of access, and the eligibility of AI chatbot providers (OpenAI, Anthropic, and others) to receive it.

Google’s compliance deadline is 27 July 2026. Failure to meet it could result in DMA charges with fines up to 10 per cent of the company’s global annual revenue. The Register noted in mid-April that Google has accumulated roughly €9.71bn in European antitrust fines since 2017, so the financial calculus on this proceeding is, even by Google’s standards, material.

What makes the proceeding unusual is that the proposed remedy, search-data sharing, is itself privacy-sensitive in ways most DMA remedies are not. The Information Technology and Innovation Foundation, in a 1 May filing, flagged the same fundamental tension: forcing a search engine to make user-search data available to rivals is, by definition, expanding the surface area on which user-search data can be exploited.

The Chamber of Progress raised parallel concerns the same week, and CyberInsider warned that the proposal could enable large-scale surveillance if anonymisation methods proved insufficient. Vassilvitskii’s intervention is the technical specification of that concern.

Anonymisation, in the modern privacy literature, is not a binary property of a dataset. It is a probabilistic property that depends on (a) the data itself, (b) the auxiliary information an attacker has access to, and (c) the technique used to anonymise. Vassilvitskii’s research career, per his Google Research profile, has focused specifically on differential privacy, the mathematical framework for measuring and bounding the re-identification risk in released datasets. His 2025 ACM SIGKDD paper on differentially private datasets for Google’s Topics API is one of the more rigorously documented applications of the framework to a live commercial system.
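The core idea of that framework can be sketched briefly. Differential privacy injects calibrated random noise so that any single user's presence in a dataset changes a released statistic only within a provable bound. The snippet below is a minimal illustration of the classic Laplace mechanism for a counting query; it is not the Commission's proposed method or Google's implementation, just the textbook construction:

```python
import math
import random

def laplace_count(true_count: int, epsilon: float) -> float:
    """Release a count with Laplace noise calibrated to epsilon.

    For a counting query the sensitivity is 1 (adding or removing one
    user changes the count by at most 1), so noise drawn from
    Laplace(0, 1/epsilon) yields an epsilon-differentially-private
    release. Smaller epsilon means more noise and stronger privacy.
    """
    scale = 1.0 / epsilon
    # Inverse-transform sampling of the Laplace distribution from a
    # uniform draw on (-0.5, 0.5).
    u = random.random() - 0.5
    return true_count - scale * math.copysign(1.0, u) * math.log(1 - 2 * abs(u))

# Example: a query count released under a fairly strict budget.
noisy = laplace_count(1_000, epsilon=0.1)
```

The point of the construction is that the privacy guarantee is a mathematical property of the release mechanism, independent of what auxiliary data an attacker holds. This is precisely what distinguishes it from the pseudonymisation-and-aggregation approaches discussed below.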

The two-hour claim, in that frame, is an empirical statement, not a rhetorical one. Vassilvitskii’s red team, working from a sample of search-engine query data anonymised under the Commission’s proposed method, was able to re-identify individual users within two hours. The anonymisation technique the Commission has proposed, in his framing, falls into a category of methods (typically combinations of pseudonymisation, aggregation, and noise injection) that have been demonstrated for over a decade to be vulnerable to linkage attacks when the underlying queries are sufficiently distinctive.

That vulnerability is not theoretical. In 2006, an anonymised release of AOL search data led to multiple users being identified by name within days, including a famous New York Times reconstruction of one specific user. The same principle applies, more starkly, to modern search data, which is now vastly more granular than the 2006 corpus and far easier to cross-reference against the public web.
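The mechanics of such a linkage attack are simple enough to sketch. The toy data below is entirely invented (the names, queries, and thresholds are illustrative, not drawn from any real dataset); it shows only the core move, namely that distinctive queries act as a quasi-identifier an attacker can join against outside knowledge:

```python
# Toy illustration of a linkage attack on pseudonymised query logs.
# The pseudonym hides the user's name, but distinctive queries can be
# matched against facts the attacker already knows about a person.
# All names and queries here are invented for illustration.

pseudonymised_log = {
    "anon_1138": [
        "plumbers in smalltown example county",
        "rare-disease-x support group",
        "example high school reunion 1987",
    ],
}

# Auxiliary knowledge: facts the attacker links to a real identity.
auxiliary = {
    "Alice Example": {
        "plumbers in smalltown example county",
        "example high school reunion 1987",
    },
}

def link(log, aux, min_overlap=2):
    """Re-identify pseudonyms whose queries overlap known facts."""
    matches = {}
    for pseudonym, queries in log.items():
        for person, known in aux.items():
            if len(known & set(queries)) >= min_overlap:
                matches[pseudonym] = person
    return matches

print(link(pseudonymised_log, auxiliary))
# {'anon_1138': 'Alice Example'}
```

Pseudonymisation removes the name column but leaves the quasi-identifier intact, which is why noise injection over aggregates, rather than record-level release, is the direction the differential-privacy literature points to.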

There is a delicate political question Google has to navigate here. The company has spent the past decade arguing, sometimes credibly and sometimes not, that user privacy is one of its core commitments. The same company is now subject to a Commission proceeding that seeks to compel it to share user data with rivals on competition grounds.

The argument that doing so would harm user privacy, regardless of whether it is technically correct, is open to the obvious counter-charge that Google’s privacy concern has surfaced suspiciously in step with its commercial interest.

The Vassilvitskii intervention is, on the available reporting, an attempt to defuse that counter-charge by anchoring the privacy argument in a researcher whose career independence and technical credibility are harder to dismiss. He has not just written a letter; he met with Commission officials in person on Wednesday and has, per his own framing, proposed alternative anonymisation guardrails that would meet the DMA’s competitive intent without producing the re-identification risk his red team has demonstrated.

Whether the Commission accepts that framing is a separate question. The political pressure on the proceeding runs in both directions: AI competitors (OpenAI, Anthropic, Perplexity, Mistral) want access to Google’s search data on the most permissive possible terms, both because it would substantially improve their commercial positions in retrieval-augmented generation and because the precedent itself, that gatekeeper search data is shareable on FRAND terms, is strategically valuable. Privacy advocates and researchers, of which Vassilvitskii is now publicly one, want the most restrictive possible terms. The Commission has six months to thread the needle.

The wider regulatory frame

Vassilvitskii’s intervention lands inside a Brussels regulatory environment that is itself under unusual strain. TNW reported earlier this year on Europe’s broader struggle over whether to dismantle parts of its own regulatory architecture in order to compete more effectively with the US, with several recent moves to soften AI Act provisions and accelerate competitive responses to US dominance in the model layer.

We have tracked the AI Act’s enforcement timeline, with high-risk system rules entering into force in August. The DMA proceeding against Google sits alongside that calendar, but on the competition rather than the safety axis.

There is also a wider transatlantic dimension. TNW has covered the EU’s tightening posture on Chinese-origin connectivity infrastructure in parallel, and the broader picture is one in which Europe is simultaneously trying to constrain US gatekeepers (DMA), Chinese vendors (Cybersecurity Act recommendations), and its own regulatory drag on European AI startups.

The Vassilvitskii letter complicates that trilemma by raising the possibility that the EU’s own competition remedies, designed to weaken US gatekeeper positions, are themselves creating user-privacy exposure that the EU’s privacy framework (GDPR) was built to prevent.

It is, on a sober read, the kind of regulatory tension Europe has not previously had to resolve. TNW’s earlier coverage of the Italy-OpenAI ChatGPT GDPR enforcement established the principle that EU data-protection law applies extraterritorially to AI systems trained on European data.

The same principle, applied to the DMA’s data-sharing remedy, suggests that any anonymisation method the Commission specifies has to clear not just the DMA’s competition test but also the GDPR’s privacy test. The Commission has, in effect, written itself a problem in which the two tests pull in opposite directions.

What happens next?

Three things will determine the trajectory of the proceeding. The first is whether the Commission revises its anonymisation specification before the 27 July decision deadline. Vassilvitskii’s red team result, if reproduced or independently confirmed, would make a continued specification of the original method increasingly difficult to defend.

TNW has covered the EU’s broader push for digital sovereignty, and an outcome in which the Commission’s headline remedy fails its own privacy test would be the kind of outcome that European regulators tend to avoid by quiet revision rather than public reversal.

The second is whether the AI chatbot providers, OpenAI, Anthropic, Mistral, and others, who are the ostensible beneficiaries of the data-sharing rule, take a public position on the privacy question. So far they have not. Their commercial interest in obtaining the data on the most permissive possible terms is in tension with their public reputational interest in being seen as privacy-respecting model operators. The longer the Vassilvitskii framing remains uncontested, the harder that tension becomes to manage.

The third is whether the European Court of Justice eventually has to rule on whether DMA remedies that produce GDPR-violating outcomes are themselves legal under EU law. That is the kind of constitutional question Brussels has, until now, managed to avoid. The Vassilvitskii letter makes it more plausible that the question is asked, by Google in court, by privacy advocates in court, or by a national data-protection authority pre-empting the Commission’s specification.

None of this excuses Google’s commercial interest in the outcome. The company would, by any honest reading, prefer not to share its search data with rivals at all, and its privacy argument is being deployed in service of a pre-existing competitive position.

What has changed is that the privacy argument is now being made by someone whose technical credibility is harder to write off and whose career has been spent in the specific sub-field that the Commission’s proposed remedy depends on.

The decision deadline is 27 July. Vassilvitskii’s two-hour figure has, on the public record, been entered into the proceeding’s evidence base. Whether it produces a revision, a delay, a litigation track, or a quiet political accommodation is the question Brussels has roughly twelve weeks to answer. 
