Metaverse Protection with Age Verification and a Panic Button
Last week the BBC reported that a reporter posing as a 13-year-old girl was exposed to strip clubs, sexual content and rape threats in the Metaverse. The incident happened on VRChat, an app that can be downloaded onto the popular Meta Quest 2 VR headset. Within VRChat, users can create their own worlds and interact with others through avatars. The reporter was exposed to naked avatars simulating sex. Following the investigation, the NSPCC said it was “shocked” by the revelations and called on tech companies to protect children within the Metaverse.
Responsibility for child safety in the Metaverse is an issue that needs urgent attention. Meta, in response to harassment in the Metaverse, has activated a personal boundary around avatars, and Microsoft-owned AltspaceVR has indicated that events should be age-limited while making its own avatar boundary active by default. But can big tech do more? Simply placing a boundary around an avatar does not prevent visual and audible abuse, nor does it use the technology available to fully safeguard adults and children within the Metaverse.
The technology exists to age-verify user avatars within Metaverse platforms and to restrict minors to areas that are safe for them to use. Additionally, areas of the Metaverse could be designed specifically for children, with adults denied access, so that children interact with peers rather than predators. Safety needs to be built into the Metaverse as part of interoperability, and a cross-platform “Panic Button” could be installed as standard technology throughout the Metaverse.
We envisage that the “Panic Button” would immediately remove the avatar from the situation, teleporting them to a safe area within the Metaverse. Once the avatar has been removed, AI could interact with them, asking whether they are OK or need to speak with a cross-platform “Metaverse Safety Team”. With investment in the Metaverse already topping $140 billion from Microsoft and Meta alone, installing a panic button supported by a “Metaverse Safety Team” and age-restricting areas of the Metaverse unsuitable for children would be a tiny investment for Big Tech.
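To make the idea concrete, the flow we envisage could be sketched as follows. This is a minimal, hypothetical illustration only: the names (`Avatar`, `Incident`, `SAFE_AREA`, the safety-team queue) are our own inventions, and no real platform API is assumed.

```python
from dataclasses import dataclass

# Illustrative destination for the panic-button teleport (hypothetical name).
SAFE_AREA = "safe-hub"

@dataclass
class Avatar:
    user_id: str
    location: str

@dataclass
class Incident:
    user_id: str
    origin: str          # where the abuse occurred, kept for the safety team
    escalated: bool = False

def panic(avatar: Avatar, incident_log: list) -> Incident:
    """Immediately remove the avatar from the situation and open an incident."""
    incident = Incident(user_id=avatar.user_id, origin=avatar.location)
    avatar.location = SAFE_AREA      # teleport to a safe area of the Metaverse
    incident_log.append(incident)    # record the event for later review
    return incident

def check_in(incident: Incident, user_is_ok: bool) -> str:
    """Automated (AI) follow-up: escalate to the human safety team if needed."""
    if user_is_ok:
        return "closed"
    incident.escalated = True
    return "escalated-to-safety-team"
```

In this sketch, pressing the button both teleports the user and logs where the incident happened, so a cross-platform “Metaverse Safety Team” has a record to act on even if the user declines the automated check-in.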