As women’s football climbs into the global spotlight, it’s facing the backlash that often follows a breakthrough. Women’s EURO 2025 was a tournament of record audiences, elite performances — and a dark digital undercurrent. UEFA’s online abuse programme, launched in 2022 and applied here in collaboration with Meta, TikTok and X, aimed to protect players, coaches and referees from targeted hate. But with rising abuse, patchy enforcement, and vague thresholds, it’s time to ask: Is this system good enough - or just good optics?
📊 Supporting Stats:
1,901 abusive posts were flagged - a 7.3% increase from 2022.
Of those, only 19.1% were deemed serious enough to be reported directly to platforms.
Just 66.6% of reported posts were actioned - leaving over a third untouched (see the rough tally after this list).
Tier 1 abuse (most severe) dropped, but Tier 2 and 3 abuse - more indirect but still harmful - rose.
Spain, England and Germany were the most affected teams; players received 67.3% of abuse.
Across Meta, TikTok and X, results varied: TikTok removed 100% of flagged content, Meta removed 91%, while X’s numbers remain opaque.
In total, more than 19,500 abusive posts have been identified across 16 UEFA competitions over the past three years.
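To put those percentages into absolute terms, here is a rough back-of-the-envelope sketch in Python. It assumes the 19.1% and 66.6% figures apply cleanly to the 1,901 flagged posts; the absolute counts below are derived from the percentages cited in this piece, not taken from UEFA’s report, so treat them as approximations.

```python
# Rough moderation funnel for Women's EURO 2025, derived from the
# percentages cited above (assumption: they apply to all 1,901 flagged posts).
flagged = 1_901

reported = round(flagged * 0.191)   # 19.1% escalated to platforms  -> ~363
actioned = round(reported * 0.666)  # 66.6% of escalations actioned -> ~242
left_online = reported - actioned   # remainder still live          -> ~121

print(f"Flagged:     {flagged}")
print(f"Reported:    {reported}")
print(f"Actioned:    {actioned}")
print(f"Left online: {left_online}")
```

On those assumptions, roughly 360 posts reached the platforms and around 120 of them stayed online - the "over a third untouched" referenced above.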
🧠 Decision: Did It Work?
Not enough.
While the intent behind UEFA’s online abuse programme is commendable, the outcomes suggest a system still lagging behind the scale and complexity of the problem. Abuse is rising, becoming more coded, and platforms remain inconsistent in enforcement.
Only 1 in 5 flagged posts were deemed severe enough to be reported. But who decides that - and by what standard? With Tier 2 and 3 content rising, the nuance of online hostility - sarcasm, dog whistles, baiting - is being missed. For players facing constant low-level abuse, that’s not just a moderation gap - it’s a failure of care.
The platforms, too, aren’t pulling their weight equally. TikTok showed strong enforcement. Meta delivered reasonably. But X - a known hotspot for real-time abuse - still provides little transparency. The lack of standardised accountability across platforms means safety is subject to platform policy, not player need.
📌 Key Takeouts:
What happened: UEFA monitored abuse during Women’s EURO 2025, flagging nearly 2,000 posts and partnering with social media platforms to take action.
What fell short: Abuse increased, with over 33% of reported content still left online. Less extreme - but more pervasive - forms of abuse are slipping through.
Who was hit: Players, especially those from Spain, England and Germany. The final alone saw 468 flagged posts.
Platform response: Inconsistent. TikTok led in enforcement, Meta was solid, and X remains vague on data and action.
Brand signal: Surface-level solutions aren’t keeping up with the realities of digital hate - especially in women’s sport, where visibility often invites aggression.
Strategic takeaway: Real protection means more than detection - it requires action, accountability and a platform-agnostic standard for abuse.
🔮 What We Can Expect Next:
UEFA’s three-year programme ends here - but this can’t be the end of investment in athlete protection. With women’s football growing commercially and culturally, brands will be expected to do more than just show up - they’ll need to stand up.
Expect louder calls for independent moderation frameworks, real-time takedown powers, and stronger legal escalation tools. If governing bodies and sponsors don’t push for systemic change, athletes will be left to fend for themselves - and the goodwill that surrounds the women’s game could curdle into distrust.
The message is clear: visibility without protection is no longer acceptable. Not for players. Not for fans. And not for the brands who want to be part of this moment.