Introduction
In the aftermath of the Rapid Support Forces’ (RSF) takeover of Al-Fashir, Darfur’s humanitarian catastrophe has escalated into one of the darkest chapters of Sudan’s ongoing war. After nearly 18 months under siege, the city’s fall two days ago unleashed widespread massacres, looting, and systematic violence against civilians.
Amid this brutality, a disturbing parallel front has emerged online. TikTok has become a central platform for celebrating atrocities, glorifying mass killings, and spreading RSF propaganda.
One of the most notorious figures exploiting this trend is an RSF fighter known as “Abu Lulu.” He has appeared repeatedly in widely shared videos across multiple platforms personally shooting civilians in Darfur, and he uses TikTok as a stage for boasting about these crimes. In a recent live session, a snippet of which was shared on Facebook, Abu Lulu claimed to have killed over 2,000 people, saying he “stopped counting after that.”
The RSF, which the United States Government formally declared in January 2025 to have committed genocide in Darfur, continues to operate freely online, with this case exemplifying a broader pattern: TikTok Lives have turned into virtual rallies for RSF fighters, commanders, and their supporters, where violence is not condemned but celebrated.
TikTok Live as a Space for Glorifying Atrocities
TikTok Live has become the most active online arena for RSF-linked propaganda and atrocity celebration. Known RSF soldiers, commanders, and supporters frequently appear in these live sessions, often wearing RSF uniforms and boasting about ongoing attacks.
These livestreams attract thousands of viewers, many of whom send virtual gifts and comments expressing admiration. Crucially, these sessions do not disappear: supporters record them, edit them into short clips, and re-upload them across TikTok, Facebook, X (formerly Twitter), and Telegram, creating an endless cycle of glorification and viral amplification.
Through this ecosystem, mass violence is reframed as entertainment, and the perpetrators gain celebrity-like status within RSF-aligned digital communities. The line between military propaganda, digital performance, and real-world atrocities has completely blurred.
A Warning Ignored: Sudalytica’s May 2025 Findings
This phenomenon is not new. In May 2025, Sudalytica and Beam Reports published a comprehensive investigation titled “From Indonesia to Sudan… Hate speech and war propaganda for profit.”
The report exposed a network of over 50 Facebook pages and dozens of TikTok accounts that monetized hate speech and inflammatory war content directed at Sudanese audiences. It revealed how TikTok Live had become a profit-driven incubator of hate, where users exploited “gift” features and algorithms that reward violent engagement.
Sudalytica’s earlier analysis explicitly warned that TikTok had become the central source of conflict-related hate content, yet the company took no visible action. The result is what we now witness: an escalation from hate speech to the open celebration of mass murder.
TikTok’s Responsibility and Complicity
TikTok’s continued failure to enforce its own content-moderation rules in Sudan has allowed the platform to become a digital engine of atrocity normalization. The company cannot claim ignorance: our team made multiple attempts to raise the issue with it, and videos and livestreams featuring RSF fighters boasting about the war and spreading hate speech have circulated publicly for a long time.
By allowing this content to thrive and even algorithmically promoting it through engagement metrics, TikTok bears direct responsibility for enabling and amplifying war crimes. The platform’s negligence effectively turns it into a co-producer of propaganda, helping spread terror, dehumanize victims, and reward perpetrators with visibility and fame.
TikTok has issued public statements about its handling of other conflicts. For the Gaza war, it announced the removal of more than 925,000 conflict-related videos and the deployment of additional content-moderation teams. Yet the platform has released no statement or transparency report on the war in Sudan, even though that war began several months before the Gaza war.
This disparity reveals a deeply troubling double standard in TikTok’s global content-moderation policies: the company acts swiftly and decisively in some conflicts, yet remains completely silent about the ongoing atrocities in Sudan and the dangerous content linked to them on its platform.
The Broader Consequences: From Violence to Viral Fame
The transformation of atrocity into content has profound implications for the Sudanese conflict. Psychologically, it desensitizes audiences and conditions them to view brutality as normal or justified. Socially, it reinforces polarization and cycles of revenge.
On a military level, it incentivizes RSF fighters to commit more violence for digital recognition and validation. The result is a war that is no longer confined to the battlefield: it is digitally perpetuated, emotionally weaponized, and made more horrific with every new viral video.
Conclusion and Recommendations
Beam Reports calls on TikTok and other social media platforms to take immediate action by:
- Removing accounts and live sessions associated with fighters from the warring parties, commanders, or propaganda networks.
- Deploying human moderation teams fluent in Sudanese Arabic and regional dialects to detect hate speech and atrocity-linked content in real time.
- Addressing monetization and live gifting features in conflict-related regions where such systems reward violent content.
- Preserving and sharing digital evidence with human rights monitors and international accountability bodies.
- Collaborating with independent fact-checking and OSINT organizations to identify and counteract coordinated networks.
TikTok’s failure to act promptly makes it a silent partner in the atrocities taking place in Sudan, turning its platform into a space that fuels the war instead of mitigating its impact.
The contents of the account mentioned above were archived by the Sudalytica team before this report was published.