Content note: This story contains discussions of children being propositioned for sexual actions and receiving sexual messages, as well as self-harm.

How liable are Roblox developer Roblox Corporation and communications platform Discord for illegal conduct by users on their platforms? That's the question both companies face in a growing suite of lawsuits, many filed by law firm Anapol Weiss. The firm represents a number of families whose children were targeted by predators on Roblox. Some of these predators encouraged these minors to communicate with them over Discord, where they sexually exploited them first electronically, then physically.

The lawsuits follow years of reporting on how Roblox's allegedly lax moderation protocols have seemingly enabled child exploitation through a combination of weak age verification and the hosting of sexually explicit user-made games. Roblox Corp. and Discord have both introduced a number of safety improvements in the last year (with Roblox unveiling new age check measures just this month), but according to some plaintiffs, the companies should have done more to protect users years ago. Last week, when quizzed about this topic, Roblox Corp. CEO David Baszucki grew combative with New York Times reporters, pushing back on repeated questions about the company's safety record.

Both companies have repeatedly denied any lax practices. And they're heading to court with case law seemingly tilted in their favor. That's because of a federal law known as the Communications Decency Act. But with the safety of so many young players on the line, it's worth asking: how does the law apply to these companies?

Section 230 broadly shields companies that host user-generated content

First passed in 1934, the law was updated in 1996 and contains a clause known as "Section 230," which provides limited federal immunity to "providers and users of interactive computer services." It has shielded telecommunication companies and social media platforms from legal liability for content posted by their users. For instance, if someone on Facebook falsely accuses you of a crime, you can sue that user for defamation, but not Facebook owner Meta.

The law also grants companies civil immunity for removing obscene or terms-of-service-violating content from their platforms, even constitutionally protected speech, so long as that removal is done "in good faith." The law does not provide immunity for criminal violations, some state civil laws, and other instances. That may mean it doesn't apply to suits filed by the states of Florida, Louisiana, and Texas.

Cases like Jane Doe v. America Online Inc. and M. A. v. Village Voice have laid out precedent relevant to the lawsuits against Roblox Corp. and Discord. In both cases, the defendants were accused of aiding and abetting the sexual abuse of minors, but federal courts ruled the companies possessed civil immunity under Section 230.

Plaintiffs' lawyers suing Roblox and Discord say this isn't about hosted content

Alexandra Walsh, an Anapol Weiss lawyer representing parents suing the company, told Game Developer her firm took on these cases with the intent of "giving victims a voice," a motivation that's "at the heart" of the firm.

"What started as a few complaints has ballooned into a wave of litigation as families across the country realize they are victims of the same systemic failures by Roblox and Discord to protect their children," she said. According to Walsh, Section 230 is "irrelevant" to her clients' claims.
"Roblox will invoke it and has invoked it because every tech company automatically invokes it when they get sued," she said. "But they are grossly overinterpreting the application of that statute. In our view, what that statute is designed to do is to limit liability in cases where an internet service provider is publishing someone else's material."

She described how the firm's cases center on how these apps were released without adequate safety features while purportedly misrepresenting their safety protections for underage users. Adult predators were able to create profiles signaling that they were children, and children were able to sign up for accounts without going through their parents.

Game developers might recognize, however, that the phenomenon of underage users signing up for online games or services without a parent's permission is as old as, well, the internet. When asked about this, Walsh said there was a distinction between how other platforms like Instagram make "some attempt" to enforce their age minimum policies and how Roblox provides minimal friction when underage users sign up for the platform.

"We're not saying that any particular measure is going to be perfect 100% of the time," she said, in allusion to age gates that might, for example, require a parent's email address to create an account. "But at least it's some friction." Walsh also pointed to news stories covering the arrests of predators who targeted minors on the platform as evidence that the danger was well documented.

Yet despite all that, Roblox and Discord may still be protected by Section 230 in these civil cases.

Proving Section 230 doesn't apply could prove difficult

Electronic Frontier Foundation attorney Aaron Mackey, director of the nonprofit's free speech and transparency litigation efforts, acknowledged it's challenging to differentiate between responsibility and liability when it comes to protecting kids online. The Foundation has been a strong advocate for Section 230, arguing that while some elements of the Communications Decency Act were flawed, the law has provided vital protections for freedom of speech on the internet.

Mackey declined to comment on the specifics of the cases against Roblox Corp. and Discord. But in a conversation with Game Developer, he explained that communication platforms of all stripes have been repeatedly found not liable for abusive messages sent on their platforms because of Section 230.

It may sound counterintuitive, but these protections are what enable online moderation in the first place. Before Section 230's existence, internet service providers CompuServe and Prodigy both faced lawsuits over their policies for moderating what users posted on their servers. The former company said it would not moderate any content, while Prodigy said it would. Both were sued, and it was Prodigy that was found liable for content hosted on its servers, even though it was the one with a moderation policy. Mackey said the law was created to let services decide for themselves what kind of speech to allow on their platforms, and to offer protections when they enforce those policies.

That raises the bar for civil suits about messages sent between users. There also appear to be protections for generic promises about child safety on Roblox and Discord. "There are cases in which plaintiffs have tried to raise this claim, which is that they're not seeking to hold [platforms] liable for the content of the communication but for representations about what they would do to protect users," he said.
"Those cases have not succeeded."

The courts have also ruled that Section 230 provides immunity for claims that cover the account creation process. "The courts ruled that 230 applied because the service's decision to offer public accounts was inherently connected with the ability for account holders to create, view, and share content on the service," Mackey said. "A legal claim that sought to change or limit the service's ability to have the account-creation process it wanted would implicate 230 because it necessarily seeks to impose liability based on the third-party content on the site."

The cases that have succeeded centered on specific promises made by online platforms to specific users. Mackey recalled a case reviewed by the Ninth Circuit about a user who faced online abuse, asked the platform owner for help, was promised assistance, and then never received it. The court ruled that Section 230 did not apply to the case because it involved the failure of a service to follow through on its promise.

How can online platforms improve child safety?

It's tempting to view Section 230 as an obstacle to holding online platforms accountable for user safety, but there's a larger patchwork of policy gaps that led to this complicated status quo. Law enforcement has been slow to act on all manner of online threats. The closed ecosystems of Roblox and Discord prevent other companies from offering third-party safety tools to parents. And laws shaped around online "child safety" have been sharply criticized for their potential to block all manner of undesired speech. Pair that with a global retreat in online moderation and you create a porous online ecosystem that stops some predators but lets others slip through the cracks.

"A general industry trend of scaling back moderation would be an abhorrent excuse for putting children in harm's way," Walsh said to Game Developer. "Other companies have successfully implemented common-sense safety mechanisms like ID age verification, mandatory parental approval by default, and robust deterrents to prevent messaging between children and adults. Corporations marketing themselves as child-friendly have a non-negotiable responsibility to prioritize child safety."

When reached for comment, a Discord spokesperson declined to discuss the specifics of these cases or whether the company plans to invoke Section 230 in its defense. "We use a combination of advanced technology and trained safety teams to proactively find and remove content that violates our policies," they said.

Roblox Corp. did not respond to multiple requests for comment.