Content note: This article discusses cases involving minors being solicited for sexual activity, receiving sexually explicit messages, and self-harm.
To what extent can Roblox Corporation and the messaging platform Discord be held legally responsible for unlawful behavior carried out by users on their services?
That question sits at the center of a growing number of lawsuits facing both companies, many of which have been brought by the law firm Anapol Weiss. The firm represents families whose children were allegedly targeted by predators on Roblox, some of whom encouraged minors to move conversations to Discord, where the abuse reportedly escalated from online exploitation to in-person harm.
Both Roblox and Discord have consistently denied claims that their safety practices are insufficient. They are also entering court proceedings with legal precedent largely on their side, thanks to a federal statute known as the Communications Decency Act. Still, given the stakes for child safety, the question remains: how does the law actually apply in these cases?
Section 230 generally provides broad immunity to companies that host user-generated content, and demonstrating that this protection does not apply can be challenging.
Passed in 1996 as part of a sweeping overhaul of the Communications Act of 1934, the Communications Decency Act includes a provision known as Section 230. This clause grants limited immunity to "providers and users of interactive computer services," shielding platforms from liability for content posted by their users. In practical terms, this means that if a user posts defamatory content on Facebook, the individual can be sued, but Meta generally cannot.
Prior rulings such as Jane Doe v. America Online Inc. and M.A. v. Village Voice have helped establish the framework relevant to the lawsuits against Roblox and Discord. In both cases, plaintiffs accused platforms of aiding or enabling the sexual abuse of minors, but courts ruled that Section 230 granted the companies civil immunity.
Alexandra Walsh, an attorney at Anapol Weiss representing several of the affected families, told a gaming publication that her firm took on the cases to "give victims a voice," a mission she described as central to the firm's work. "What began as a handful of complaints has grown into a wave of litigation," she said, "as families nationwide realize they've been harmed by the same systemic failures at Roblox and Discord to safeguard children."
Walsh argues that Section 230 does not apply to her clients' claims. "Roblox invokes it like every tech company does when sued, but they are vastly overstating its scope," she said. In her view, the statute is intended to limit liability when platforms are merely hosting third-party content, not when they allegedly fail to implement adequate safety systems or misrepresent protections for minors.
She explained that the lawsuits focus on how the platforms allegedly launched and maintained their services without sufficient safeguards, while portraying them as safe for children. According to Walsh, adults were able to create profiles posing as minors, and children could sign up without parental involvement.
"No safeguard is perfect," she acknowledged, referencing tools like age gates or parental email verification. "But even small barriers create friction. They make some kids stop and think."
Walsh also said it is relatively easy for minors to disable Discord's parental controls without their parents' knowledge, something predators can exploit. She suggested that a more robust system would automatically notify parents when those controls are turned off.
The connection between the two platforms is reinforced by Roblox's Discord integration. In one case cited by Walsh, a Florida-based predator allegedly used Roblox to contact a teenager, then moved the interaction to Discord, where further exploitation occurred.
"Roblox markets itself as a platform that's safe and appropriate for children," Walsh said. "Yet the company knows that child predators are active on the service every day." She pointed to Roblox's regular reports to the National Center for Missing and Exploited Children, along with publicized arrests linked to the platform, as evidence.
Aaron Mackey, an attorney with the Electronic Frontier Foundation, declined to comment on the specifics of the Roblox and Discord lawsuits. However, he noted that courts have repeatedly ruled that communication platforms are not liable for abusive messages sent by users due to Section 230 protections. While counterintuitive, he said, these protections are what allow online moderation systems to function at all.
Combined with an industry-wide pullback from intensive moderation, this legal framework can create gaps that some predators exploit. Walsh rejected the idea that industry trends justify such outcomes. "Scaling back moderation is not an excuse for putting children at risk," she said.
She added that other companies have implemented measures such as ID-based age verification, default parental approval, and strong deterrents against adult-to-child messaging. "Any company branding itself as child-friendly has a non-negotiable duty to prioritize child safety."
A Discord spokesperson declined to address the specifics of the lawsuits or whether the company plans to invoke Section 230. They said the platform relies on "advanced technology and trained safety teams" to proactively identify and remove policy-violating content.
The lawsuits follow years of reporting that has scrutinized Roblox's moderation practices, including concerns over age verification and the presence of sexually explicit user-generated content. Both Roblox and Discord have rolled out new safety features over the past year; Roblox introduced additional age-check measures this month, but plaintiffs argue those steps came far too late.
Courts have also found that Section 230 can apply to claims related to account creation itself. Mackey explained that judges have ruled the ability to create public accounts is inherently tied to users' ability to post and interact with content. As a result, lawsuits seeking to restrict or alter those processes can still fall under Section 230 immunity.
While Section 230 is often framed as a barrier to holding platforms accountable, it exists within a broader web of unresolved policy challenges. Law enforcement responses to online abuse have been inconsistent, closed platform ecosystems limit third-party safety tools, and child-safety legislation has drawn criticism for potentially suppressing lawful speech.
