Roblox's daily active users fell by 20 million to 132 million in Q1 2026, a drop that lays bare the backlash against essential child safety measures like age verification, measures meant to protect the platform's most vulnerable young players from harm. Much of this resistance comes from communities unwilling to prioritize the safety of children, particularly those from marginalized backgrounds who face heightened risks online.

The rollout of AI-powered facial age checks, intended to restrict chat between age groups and curb predators' access to minors, has drawn widespread criticism over privacy concerns and easy workarounds. Yet those very flaws underscore the urgent need to refine the protections, not abandon them. Compounding the DAU decline, Russia banned Roblox in December 2025 over child safety and 'extremist' content concerns, a move that sparked rare protests and highlighted global tensions around platform accountability. Meanwhile, Roblox posted $1.4 billion in revenue, proof that profits can persist even when growth slows for safety's sake, and an opening for a conversation we badly need about corporate priorities.

Community forums on Reddit and X are rife with calls to revert the age checks, with users decrying bans for non-compliance and even selling verified accounts. But this pushback ignores how predators exploit unverified spaces, endangering kids daily. Roblox's Chief Safety Officer Matt Kaufman has responded to the outcry, yet only 51% of daily active users have completed verification, a sign of incomplete adoption while the risks persist. Real progress means confronting these attitudes and advocating for refined, inclusive safety tools that empower young players rather than alienate them.