Roblox is a hugely popular online gaming platform that contains its own little universe of games. It is primarily targeted at children.
Users can create their own games, play games built by others, and chat with other players. If you’re a parent, you’ve likely heard of it. You might even be bribing your kids with Roblox gift cards to do chores.
Roblox reported 79.5 million average daily users playing Roblox games in the second quarter of 2024. These users spent 17.4 billion hours on the Roblox platform over the quarter, which works out to a whopping 220 hours or so per user, or around 2.4 hours per day.
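For readers who want to check the figures, the per-user numbers follow directly from the two reported totals. A quick back-of-envelope calculation (assuming a 91-day quarter):

```python
# Roblox's reported Q2 2024 figures: 79.5 million average daily users
# and 17.4 billion hours of engagement over the quarter.
daily_users = 79.5e6
quarterly_hours = 17.4e9
days_in_quarter = 91  # April-June 2024

# Hours per user across the quarter, then per day.
hours_per_user_quarter = quarterly_hours / daily_users
hours_per_user_day = hours_per_user_quarter / days_in_quarter

print(round(hours_per_user_quarter))      # roughly 219 hours per user
print(round(hours_per_user_day, 1))       # roughly 2.4 hours per day
```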
While kids can engage in fun digital play and even learn basic programming on Roblox, they may also have troubling encounters, such as exposure to sexually explicit content, grooming or cyberbullying. Those troubling experiences for children are the official reason why Turkey banned access to Roblox on August 7, say Ausma Bernot of Griffith University and Joel Scanlan of the University of Tasmania.
Why did Turkey suddenly ban Roblox?
In a machine-translated statement on X, Turkey’s Justice Minister Yılmaz Tunç said that “according to our constitution, our state is obliged to take the necessary measures to ensure the protection of our children”.
The block was ordered by the Adana 6th Criminal Court of Peace, which cited an internet governance law Turkey enacted in 2007 and last updated in 2020 to immediately ban access to Roblox.
The law outlines quick-moving consequences. If a court determines content on an online platform as unlawful, the president of the Information Technology and Communication Authority then has 24 hours to review the decision. If a ban is decided upon, it has to be implemented within just four hours.
This is likely bad news for Roblox. Turkish authorities determined “the infringement could not be prevented by technically banning access to the infringing content”. They blocked access to the whole site instead.
Roblox issued a statement noting it is working with local authorities with the goal of resolving the ban. Specifically, Roblox will need to prove its content can be moderated in such a way that children would not be harmed.
Turkey also recently banned Instagram, and has discussed plans to ban TikTok as well. Instagram was banned for nine days after it allegedly blocked condolence posts following the assassination of Hamas leader Ismail Haniyeh.
How safe are kids playing Roblox?
The scale of the troubling content problem on Roblox and other large tech platforms is hard to quantify. Monitoring and reporting on these large social media and gaming platforms is often opaque.
The real-life accounts of parents about their children’s experiences are sobering, fuelling criticism that Roblox is not sufficiently monitoring the content that ends up in children’s message boxes.
In a BBC report earlier this year, an eight-year-old boy revealed that people he met on Roblox asked him for nude photos. At least 20 people have been detained in the United States since 2018 on charges of harassing or kidnapping people they met on Roblox.
With 42 per cent of Roblox users under the age of 13, it is obvious why the platform is targeted by such offenders. This raises understandable concerns for parents (and Turkey) about the risks posed to children.
Roblox does try to moderate the content on its platform. In 2023, Roblox made 13,316 reports to the National Center for Missing and Exploited Children, up from 2,973 in 2022. However, with a monthly active user base exceeding 200 million, such numbers are received with some scepticism.
The problem of grooming and child abuse material is not unique to Roblox, either. There have been recent efforts by other platforms to be more open about their monitoring, and measures to protect children. This includes partnering with trusted third parties to maximise the impact of such initiatives.
Earlier this year Aylo (owner of Pornhub) participated in an evaluation of a deterrence campaign on their platform. In the United Kingdom, Project Intercept by the Lucy Faithful Foundation is working with tech platforms to stop online sexual abuse of children.
What can we do about preventing digital harm to kids in Australia?
Australia’s eSafety Commissioner holds the view that we “can’t regulate our way out of online harms”. Instead, the office advocates safety by design, where companies are encouraged to invest in risk mitigation at the front end.
Safety by design, transparency and collaboration with external organisations are key to building trust in platforms, and to enabling regulators to create effective policy.
Interestingly, Roblox is already doing this work. It has joined the Tier 1 social media program of Australia’s eSafety Commissioner, which enables the commissioner’s office to have reported content removed more quickly.
In 2019, Roblox also hired a director of digital civility to figure out how to make the platform a secure place for kids to play. The director has since launched a free digital civility curriculum and a Digital Safety Scavenger Hunt that teaches kids about safe and unsafe in-game interactions, such as account baiting.
However, the current issues speak to the importance of this work, and the need for strong regulator engagement by tech firms. We need continuous improvement of monitoring systems industry wide. The current statutory review of the eSafety Commissioner is an opportunity for expanding the office’s ability to act in relation to platforms at this scale to protect Australians.
Ausma Bernot, Lecturer in Technology and Crime, Griffith University and Joel Scanlan, Senior Lecturer, College of Business and Economics, University of Tasmania
This article is republished from The Conversation under a Creative Commons license. Read the original article.