Concerns Rise Over Safety of Children on Roblox Platform

Roblox, an online gaming community offering millions of games and immersive 3D environments, has come under increasing criticism over the safety of its young users. The platform, which markets itself as "the most comprehensive virtual universe," has acknowledged that children using its services may be exposed to dangerous content and encounters with "bad actors." With over 85 million daily active users, nearly 40% of whom are under 13, Roblox has an unusually large youth presence, underscoring the need for more comprehensive safety measures.

Roblox has made strides toward enhancing protections for its millions of young users. In November, the platform changed its settings so that accounts set to "under 13" can no longer send direct messages to others outside of games or experiences. Under these limitations, underage accounts can see only publicly broadcast messages. In a more forward-looking move, Roblox is deploying real-time AI moderation for voice chat, a feature available only to phone-verified accounts registered to users aged 13 and over.

Despite these efforts, experts caution that the safety controls in place remain inadequate. Digital-behavior researchers at Revealing Reality have pointed out that "safety controls that exist are limited in their effectiveness and there are still significant risks for children on the platform." This criticism highlights what may be the largest obstacle standing between Roblox and a secure experience for its majority underage user base.

User-generated content, the platform's main draw in the form of its vast library of user-made games, creates additional risks. Roblox itself produces only a small percentage of this content; the overwhelming majority comes from its community of players turned creators. This complicates moderation efforts, as filters and moderation can be easily circumvented, allowing predatory behavior to potentially thrive.

Roblox’s Chief Safety Officer, Matt Kaufman, emphasized the company’s commitment to safety: “Trust and safety are at the core of everything we do.” He added, “We’re always improving our policies, technologies and moderation processes to make our platform the safest community for young people.” Kaufman underscored Roblox’s long-term commitment to investing in more advanced safety technology and to empowering parents and caregivers with effective parental controls and age-appropriate educational resources.

The company acknowledges that verifying the ages of users under 13 remains a difficult task, a challenge that points to a deeper issue pervading online gaming and social media more broadly. Beeban Kidron, a crossbench peer and internet safety campaigner, has argued that platforms like Roblox should study how users actually behave on their products: “This kind of user research should be routine for a product like Roblox.”

Roblox recently introduced new tools designed to give parents more oversight of their children’s accounts. These features let parents monitor their children’s interactions and restrict the gaming experiences they can access. Critics, however, warn that these measures will be inadequate to mitigate the harms associated with user-created content and freeform interactions.
