Hot on the heels of Thursday’s first reading of the UK’s Online Safety Bill comes Meta’s announcement that it will let parents place limits on their children’s exploration of the virtual world.
Back in February, as part of a BBC News investigation, researcher Jess Sherwood entered VRChat, an online virtual-world platform that lets users interact with others through user-created 3D avatars and worlds.
Jess was able to create a fake profile and, posing as a 13-year-old girl, was exposed to sexual content, racist insults and a rape threat. The reporter also saw avatars simulating sex and was propositioned by numerous men.
While VRChat has been praised for helping people with autism improve their public speaking and for enabling human interaction during Covid-19, the BBC investigation highlighted clear issues around children being exposed to harmful content.
Following the BBC News investigation, the British child protection charity the National Society for the Prevention of Cruelty to Children (NSPCC) raised concerns that some metaverse applications are "dangerous by design" and that improvements to online safety were needed as a matter of urgency.
Meta is now rolling out tools that allow parents to stop children from accessing certain apps and to block them from downloading or purchasing age-inappropriate apps in the Quest Store. It is also releasing a Parent Dashboard that lets parents link to their child's account.
The Online Safety Bill is intended to protect children from harmful content such as pornography and to limit people’s exposure to illegal content, while protecting freedom of speech. It will require social media platforms, search engines, and other apps and websites that allow people to post their own content to protect children, tackle illegal activity and uphold their stated terms and conditions.
The question is: are the tech giants doing enough to protect children from inappropriate content?