Appealing into a void - can the DSA protect Europe's marginalised?
By Repro Uncensored
1. The Digital Services Act and the Promise of Rights-Based Enforcement
In theory, the Digital Services Act (DSA) represents one of the strongest attempts globally to regulate platform behavior. In practice, however, evidence gathered by Repro Uncensored demonstrates that the communities most affected by content moderation, particularly reproductive health providers, sex worker-led organizations, queer organizations, and feminist groups, continue to experience systemic harms without effective remedy, even within European jurisdictions covered by the DSA.
The DSA was introduced to address long-standing harms caused by opaque and arbitrary platform governance. At its core, the DSA aims to rebalance power between large online platforms and users by introducing enforceable obligations related to transparency, due process, systemic risk mitigation, and the protection of fundamental rights. For very large online platforms and very large online search engines, the regulation establishes a centralized, European-level enforcement regime, granting the European Commission investigative and sanctioning powers, including access to platform data, interim measures, and fines of up to six percent of global annual turnover. Yet users across many fields remain powerless in the face of opaque, unfair, and ineffective in-platform appeals against demoted or removed content and deleted accounts.
2. Repro Uncensored’s Investigation: Documenting a Wave of Platform Censorship
Between October and December 2025, Repro Uncensored investigated widespread content moderation harms affecting sexual and reproductive health, abortion access, sex worker-led, queer, and feminist organizations across digital platforms. The documentation, data collection, and analysis were carried out by Repro Uncensored as part of its ongoing monitoring and incident reporting work, and were later reported on by international media.
Through direct reports, case documentation, and longitudinal tracking, Repro Uncensored identified over 50 organizations globally whose accounts were removed, restricted, or repeatedly disrupted across Facebook, Instagram, and WhatsApp. Many of the affected organizations were based in the Netherlands, Belgium, and the United Kingdom, operating in legal contexts where abortion information, sexual health education, LGBTQ advocacy, and sex worker organizing are lawful. Additional cases were documented across other European countries and beyond.
In 2025 alone, Repro Uncensored documented 210 incidents of account removals and severe restrictions, compared with 81 incidents the previous year. This escalation reflects a structural pattern rather than isolated enforcement errors.
Documented actions included permanent account deletions without prior warning, shadow banning and algorithmic suppression without notification, removal of non-explicit educational or artistic content under sexual or nudity policies, and repeated blocking and reinstatement of abortion and health hotlines, particularly on WhatsApp.
In most cases, platforms failed to provide clear explanations, identify specific policy violations, or offer functional appeals processes, mirroring academic research from the past decade.
Dr Carolina Are, a fellow at the London School of Economics and Political Science, says:
“I’ve been observing content moderation at the intersection of online abuse and censorship for the past decade. Sex workers, educators, LGBTQIA+ users, journalists and activists have been both overly affected by censorship and online harms, and under-served by regulatory efforts. My research documents the devastating financial and emotional impacts of censorship against marginalised communities, who have lost huge amounts of money due to censorship, and have experienced depression, stress and stigma as a result of unsteady workplaces, loss of livelihood and governance discrimination.
Censorship’s impact would not be so earth-shattering on their lives if in-platform appeals worked, but they don’t: my studies have shown social media appeals are ineffective at best and negligent at worst. Participants appeal into a void, often with no update or log to check the status of their appeal and no access to a human ever, leading some to pay thousands to hackers as a last resort to recover their means of work and connection.”
3. Beyond Content Moderation: Material Harm to Communities
The harms documented by Repro Uncensored extend far beyond content visibility. Many of the affected organizations provide time-sensitive health information, including abortion access guidance and crisis support. Disruptions to digital infrastructure, such as the blocking of messaging services or sudden account deletions, can have serious consequences for people seeking care.
Organizations reported loss of access to tens of thousands of users overnight, erasure of years of educational archives and community history, inability to plan operations due to repeated and unpredictable enforcement, and increased pressure to self-censor to avoid future takedowns.
These impacts undermine access to information, freedom of expression, and collective organizing, raising direct concerns under the fundamental rights protections the DSA is meant to safeguard.
4. What the DSA Requires in Practice
Under the Digital Services Act, very large online platforms and very large online search engines are subject to heightened obligations, including identifying and mitigating systemic risks arising from their content moderation and recommender systems, providing users with clear and meaningful explanations for moderation decisions, ensuring accessible and effective appeals mechanisms, and addressing risks to fundamental rights, including freedom of expression, access to information, and non-discrimination.
The European Commission is empowered to investigate potential non-compliance, request internal platform data and algorithmic information, impose interim measures in cases of urgent harm, and issue significant financial penalties. These powers are explicitly designed to prevent exactly the kinds of opaque and discriminatory enforcement patterns documented by Repro Uncensored.
5. The Enforcement Gap: From Legal Obligations to Lived Experience
Martha Dimitratou, Founder and Executive Director of Repro Uncensored, says:
“Through our investigation, we consistently saw that the harm does not stop at content removal. Organizations lose access to communities overnight, health information disappears, and years of collective knowledge are erased, all without explanation or meaningful recourse. The Digital Services Act promises accountability, but for the people most affected by platform enforcement, those protections remain invisible. When communities cannot even find where to appeal, let alone understand how to trigger enforcement, the DSA exists only on paper, not in practice.”
Despite the existence of this enforcement framework, Repro Uncensored’s findings point to a persistent and troubling gap between the DSA’s requirements and platform practices.
Affected organizations report little awareness of, and no meaningful access to, DSA-based remedies. Appeals remain largely automated, slow, or non-responsive. In several cases, accounts were reinstated only after media attention or public advocacy, rather than through formal appeals processes.
There is no evidence that interim measures have been deployed in response to documented urgent harms, such as the repeated disruption of abortion hotlines or the removal of lawful health information. This raises questions about how thresholds for serious user harm are being assessed and applied in practice.
Enforcement also remains opaque and inaccessible to communities. Organizations describe closed-door engagements with platforms that explicitly exclude critique or policy feedback, reinforcing existing power imbalances rather than correcting them.
6. The Inaccessibility of DSA Redress Mechanisms for European Users
A central barrier to the implementation of the Digital Services Act is that affected users and organizations across Europe cannot practically locate, understand, or access the mechanisms that are supposed to protect them.
While the DSA formally guarantees users the right to receive explanations for content moderation decisions and to challenge those decisions through appeals and out-of-court dispute settlement, Repro Uncensored’s investigation found that these rights remain largely theoretical for European users, who describe a disconnect between their experiences and the protections the DSA is supposed to provide.
Organizations affected by account removals or restrictions consistently reported that they did not know where or how to invoke DSA-specific remedies. Platform interfaces do not clearly reference the DSA, do not explain which enforcement regime applies, and do not distinguish between internal platform appeals and legally grounded DSA redress mechanisms. Users are typically directed to generic help centers, automated forms, or opaque account status pages that provide no meaningful legal pathway for challenge.
Even organizations based in the European Union, operating lawfully under national and European law, reported that they were unable to identify which Digital Services Coordinator was competent, how to submit a complaint under the DSA, or whether their case had been escalated beyond automated review systems. This lack of clarity shifts the burden of legal interpretation onto users, many of whom lack the resources or legal expertise to navigate fragmented regulatory structures.
As a result, appeals processes remain functionally inaccessible. Organizations described submitting multiple appeals without receiving responses, receiving standardized messages unrelated to the substance of their case, or having accounts reinstated without explanation after prolonged disruption. In no documented case did an organization report a clear and traceable DSA-based enforcement pathway that resulted in a timely remedy.
The absence of visible, user-facing DSA mechanisms prevents the accumulation of enforcement signals, obscures patterns of systemic harm, and limits the ability of regulators to respond proactively. Enforcement instead becomes dependent on media exposure or exceptional advocacy rather than routine accountability. This signals a disconnect between the DSA and the communities it is supposed to protect, and shows that platform governance responds selectively to media pressure rather than to user-driven accountability.
7. Structural Drivers of Non-Implementation
The gap between the DSA’s legal framework and its enforcement is not accidental. Repro Uncensored’s investigation points to several structural drivers.
These include continued reliance on platform self-reporting despite clear conflicts of interest, insufficient recognition of civil society documentation as enforcement-relevant evidence, lack of community-centered escalation pathways within DSA enforcement mechanisms, and global US-centric moderation systems applied without regard to European legal or social context.
Dr Are says:
“Platforms have a history of playing it both ways, marketing themselves to us as a civic space but governing as a private company, moved by economic interests and brand safety. My research has documented how they use gaslighting as a strategic communications strategy with users, the public and institutions alike, denying or minimising user experience to avoid scrutiny into their operations and accrue more power - until the evidence’s too much to deny. This means we, as a society, rely on regulators to compel them to be transparent and fair about their processes. Self-reporting simply isn’t enough: the DSA has a real chance to scrutinise in-platform appeals and operations in general, but its enforcement and its users’ ability to report issues with in-platform appeals and content moderation as a whole must serve communities, and not Big Tech.”
As a result, enforcement remains reactive, discretionary, and largely detached from the lived realities of those experiencing harm.
8. Implications for Rights, Democracy, and Trust
When regulatory protections exist on paper but fail in practice, trust in digital governance erodes. The continued censorship of lawful reproductive health, sex worker-led, and queer content within the European Union demonstrates that the DSA, as currently enforced, is not yet delivering meaningful protection to those most at risk of platform overreach.
If left unaddressed, this enforcement gap risks entrenching discrimination, weakening democratic participation, and normalizing the silencing of marginalized communities within Europe’s digital public sphere, creating a broader disconnect between citizens and European legislation.
9. Toward Meaningful Enforcement
Closing the gap between the DSA’s promise and reality requires a shift from formal compliance to substantive protection. Based on the evidence gathered, this includes:
Recognizing civil society documentation and incident reporting as valid enforcement evidence
Mandating transparency and timelines for appeals and enforcement outcomes
Proactively deploying interim measures when access to health or safety information is disrupted
Conducting independent audits of automated moderation systems affecting protected groups
Publicly reporting on enforcement actions taken in response to documented harms
Creating a coordinator to engage with and educate these communities about making use of the DSA’s out-of-court dispute settlement mechanism
Creating event- and content-based opportunities to inform the public about using the DSA.
Without these steps, the Digital Services Act risks functioning as a symbolic framework rather than a tool for accountability.