
Roblox and Discord Under Fire for Child Safety Failures
Two of the internet’s biggest playgrounds, Roblox and Discord, are officially in the hot seat over child safety — and it looks like things could get messy.
State officials in Florida and New Jersey are launching serious investigations into how these platforms protect (or fail to protect) their young users. Roblox, which clocks an insane 85 million daily players, and Discord, with 200 million monthly users, are now facing legal pressure to explain how they're actually keeping kids safe, not just promising they are.
"Florida Attorney General James Uthmeier has subpoenaed Roblox Corp. for information about how it markets to children and its anti-predator policies."
The focus is on whether Roblox is doing enough to prevent interactions between kids and potential predators. According to Uthmeier’s office, they want receipts: moderation policies, how age verification works (or doesn't), and how easy it is for kids to dodge parental controls.
Roblox responded, saying:
"We intend to cooperate with the Attorney General’s office and look forward to sharing all the work Roblox does to help keep users safe."
Good PR statement, but the court of public opinion is already buzzing — and not in a good way.

Discord Issues
If Roblox is catching heat, Discord might be standing even closer to the fire.
New Jersey Attorney General Matthew Platkin just sued Discord straight up, accusing the company of misleading families about how safe its app really is. The lawsuit claims Discord promised protective features like message filters but didn’t deliver. Kids were still exposed to horrific stuff — we're talking child sexual abuse material, violent content, and even terror-related media.
"Discord misled users and parents by overstating its safety features and failed to scan messages even between friends," the lawsuit says.
Yikes.
Even more damning, the suit says Discord doesn't properly enforce its own minimum age rule (supposedly 13+), making it super easy for younger kids to slip through the cracks.
Discord’s official response? Pretty defensive:
"Given our engagement with the Attorney General’s office, we are surprised by the announcement that New Jersey has filed an action against Discord today."
Translation: "Wait, you’re suing us? We thought we were good?"

Why It’s Blowing Up Now
Both platforms have been trying to fix things, at least on paper.
Roblox says it rolled out 40 safety updates in 2024 alone, including a major one in November that blocks kids under 13 from entering social spaces like virtual clubs. Discord, for its part, had CEO Jason Citron testify before Congress last year about child safety — right as the company gears up for a possible IPO.
But for a lot of people (and a lot of parents), it feels like too little, too late.

Bloomberg Businessweek reported last summer that Roblox had serious issues keeping predators off its platform. Since 2018, US police have arrested over two dozen individuals accused of grooming or abusing victims they met through Roblox. And insiders say the problem is huge, partly because of the sheer scale of these platforms.
"Employees at Roblox described the breadth of the challenge of policing the company's huge user base."
Basically, even with all the AI, mods, and reporting systems in place, bad actors still slip through.

Roblox Today, Discord Tomorrow
Both Roblox and Discord are staring down serious legal trouble, and public trust is definitely on the line.
If Roblox can't show that its systems are airtight, it risks more regulation (and angry parents pulling their kids out). For Discord, the lawsuit could not only delay its IPO but also tank its reputation just as it’s trying to expand beyond gaming circles into a broader social platform.
There's also the bigger question of accountability. How responsible should platforms be when they act more like playgrounds than traditional social networks? And how realistic is it to fully police spaces with hundreds of millions of users?
These cases could set major precedents for how apps manage child safety going forward, not just for Roblox and Discord, but for the entire internet.
This isn’t just about two companies getting in trouble. It's about who’s responsible when virtual playgrounds turn dangerous.