Author: Nikhil Singh

  • Published: May 14 2025 05:43 PM
  • Last Updated: May 24 2025 03:21 PM

NOYB challenges Meta’s use of EU user data for AI training; legal action could cost billions if plans move ahead without user consent.



Meta is in hot water again—this time in Europe. They’re planning to start using people’s personal info to train their AI models. And a privacy rights group from Austria called NOYB is not happy about it. They sent Meta a legal warning, saying the way Meta wants to collect this data just doesn’t sit right under European privacy laws.

Why the Privacy Group Is Upset

NOYB, which is run by privacy activist Max Schrems, is calling out Meta for relying on "legitimate interest", one of the GDPR's legal bases, as its justification for processing people's data. In effect, Meta argues it can use personal information without asking for explicit permission because it has a legitimate business reason. NOYB says that argument doesn't hold up, especially when the purpose is training AI systems.

  • Schrems pointed out that the European Court of Justice already ruled against Meta using this same excuse for targeted ads.

  • So the big question is: if that was a no-go, how can it be okay now for AI?

  • NOYB wants Meta to ask users first, not just give them a way to opt out later.

  • They also said Meta should try using anonymous data if it really needs to train its AI models.

Honestly, it sounds like NOYB is saying: “If you’re going to use people’s info, at least be upfront and ask them.”

What Meta Is Planning to Do

Meta isn’t staying silent. They’ve said they’ll be giving users a form soon where they can say no to their data being used for AI. And they’re also saying that certain things—like private messages and data from users under 18—won’t be touched.

But that’s not good enough for some. Critics say the plan isn’t transparent: how many people actually know what kind of data would be used, and will the objection form be easy to find and understand?

  • There’s concern that many users won’t even realize this is happening.

  • The language Meta is using is kind of vague, which makes people suspicious.

  • NOYB is basically saying: “Don’t just quietly roll this out and hope no one notices.”

What Could Happen Next

If Meta doesn’t change its plan or respond properly by May 21, NOYB says it may take things further, including seeking an injunction or bringing a collective lawsuit, which could be a huge deal.

  • There’s an EU framework called the Collective Redress Mechanism that lets large numbers of users band together to seek damages.

  • Schrems estimates that if even a fraction of Meta’s users in Europe joined in and asked for compensation of, say, €500 each, the company could be looking at billions in damages (a rough back-of-envelope sketch follows this list).

  • NOYB also wants more transparency: they’re asking Meta to clearly say what kind of data they’re collecting, what it’s used for, and what kind of AI features are being trained with it.
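To make that "billions" figure concrete, here is a minimal back-of-envelope sketch. The EU user count and participation rate below are assumptions for illustration only; the €500-per-user figure is the example from Schrems' comments, and none of these numbers come from Meta or NOYB filings.

```python
# Purely illustrative back-of-envelope for the scale NOYB is pointing at.
# All inputs except the €500 example are assumptions, not reported figures.

EU_USERS = 250_000_000          # assumed rough order of magnitude for Meta's EU user base
PARTICIPATION_RATE = 0.05       # assume 5% of users join a collective action
DAMAGES_PER_USER_EUR = 500      # Schrems' illustrative per-user figure

claimants = int(EU_USERS * PARTICIPATION_RATE)
total_exposure_eur = claimants * DAMAGES_PER_USER_EUR

print(f"Claimants: {claimants:,}")                    # 12,500,000
print(f"Potential exposure: €{total_exposure_eur:,}") # €6,250,000,000, i.e. billions
```

Even with only a small share of users taking part, the assumed numbers land in the billions of euros, which is why the collective-redress route gets so much attention.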

This whole situation shows how shaky the line still is between using people's data to improve AI and protecting their privacy. Right now, a lot of people, not just NOYB, are watching to see what Meta does next.

FAQ

What is NOYB’s complaint about?
Meta plans to use user data for AI training without explicit consent, which violates EU privacy rules, according to NOYB.

Who filed the legal notice against Meta?
The Austrian privacy group NOYB, led by activist Max Schrems, filed the legal notice.

What does NOYB want Meta to do?
They want Meta to stop using user data without consent and switch to an opt-in model.

When does Meta plan to start?
Meta plans to begin data collection for AI training on May 27, 2025, in Europe.

What could happen if Meta goes ahead?
Meta may face an injunction, fines, or even a class-action lawsuit across the EU.
