Roblox has announced new measures to improve child safety on its platform, including blocking children under 13 from sending direct messages to other users. This change is part of the company’s ongoing efforts to better protect young players online.
Under the new rules, children will no longer be able to send private messages unless a verified parent or guardian gives permission. Parents will also have more control over their child’s account, allowing them to view their child’s list of friends, manage their gameplay time, and monitor what content they can access.
Roblox, a popular gaming platform, is used daily by millions, with a significant number of its users being children aged 8 to 12. According to Ofcom research, Roblox is the most popular gaming platform among children in the UK. However, the platform has faced calls to improve its safety features as it grows in popularity.
Changes to Messaging and Parental Controls
The changes will be gradually rolled out starting Monday, with full implementation expected by March 2025. While children will still be able to participate in public chats within games, they will not be able to send private messages without parental consent.
Matt Kaufman, Roblox’s chief safety officer, emphasized the company’s commitment to safety. "As our platform has grown, our approach to safety has had to evolve," he said. "Our goal is to keep all users safe, no matter what age they are." Kaufman also pointed out that thousands of Roblox employees are dedicated to safety, with 10% of the workforce focused on this area.
New Features for Parents
In addition to blocking messaging, the changes will offer parents more tools to monitor their child’s activity on the platform. Parents will be required to verify their identity through government-issued ID or a credit card to access these parental controls. Kaufman urged parents to ensure their child’s age is set correctly on their account.
Roblox is also simplifying the way it labels content on the platform. It will replace age recommendations for games with content labels that outline the nature of the experience. These labels will help parents make informed decisions about what their children can play based on the maturity of the content, rather than their child’s age.
Content Labels and Restrictions
For example, games will be labeled as “minimal,” “moderate,” or “restricted,” depending on the level of mature content. Children under nine will only be able to access “minimal” content by default, though parents can give permission for “moderate” games. “Restricted” content will be accessible only to users over 17 who have verified their age.
The new rules also include a ban on children under 13 from accessing “social hangouts,” where they can text or voice chat with other players. Beginning December 3, game creators will need to indicate whether their games are appropriate for children. Games that don’t meet the guidelines for under-13s will be blocked from young players.
Response to the UK’s Online Safety Act
These changes come as the UK prepares to enforce new regulations under the Online Safety Act, which aims to protect children from harmful and illegal content online. Ofcom, the UK’s media regulator, has warned that platforms like Roblox will face penalties if they do not meet the requirements for keeping children safe. The watchdog will release its full codes of practice for companies in December.
With these new safeguards, Roblox is taking significant steps to enhance the safety of its younger audience, addressing concerns over privacy and online interactions as the platform continues to grow.