Following the news that a researcher posing as a 13-year-old was able to access adult content on the Metaverse, it is clear that metaverse creators will need far more effective age verification to protect children from harm and thereby build trust in their platforms.
Cybersecurity experts argue that it is possible to eradicate the age-old trick of simply claiming, under a fake identity, to be older than you are. Doing so will take new, powerful verification technologies, both in the sign-up process and continuously throughout use of the platform.
Unfortunately, this is just one of the many ways in which children can access inappropriate and dangerous content on the Facebook/Meta platform. Meta have never actively enforced the age restrictions they profess to abide by, because doing so would hurt their business model and profits. Instead, they pay lip service to regulators across the globe and hide behind ineffective policy changes and press releases to stave off any credible intervention. The only defence is for adults with responsibility for young people, in any capacity, to do their best to understand how the children in their care use the internet and to try to keep them safe. There is little chance Meta are going to help any time soon.
The Metaverse has enabled users to create new situations and locations in a fantasy realm, but this creativity is already being abused by users who will ultimately ruin it for younger audiences. The Metaverse is still very young, and regulation is far from ready to police this new virtual world. Meta must design the platform with security, privacy and safety in mind, yet profits are clearly dominating while it finds its feet. If younger users are tipped to be the generation that takes it on, Meta must enforce better protection and safety measures for all audiences.
Fake news, malicious actors, adult content: the metaverse, like any environment where people hide behind avatars, poses many risks to children. Key to preventing children from accessing harmful content is age verification that actually works. To weed out fake identities from real people, metaverse providers will need strong identity verification, both in the sign-up process and continuously as the platform is used, to avoid situations where a child accesses an adult's account. This can be achieved by deploying facial recognition technologies; biometrics are the key here. Biometrics provide value through the security they bring, which includes matching the person behind the avatar to a genuine identification document. Liveness technologies then check that the face is genuinely present behind the camera and not spoofed in paper, mask or electronic form. All of this should go hand in hand with a non-intrusive, frictionless user experience that makes people's lives easier (avoiding PINs, passwords and multi-step security checks). Finally, transparency must give control of the biometric data to its rightful owners, the people themselves.
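The flow described above, matching a selfie against an ID document at sign-up, running a liveness check, and periodically re-verifying during a session, can be sketched in outline. This is a minimal illustration only: all function names are hypothetical, and the scoring and liveness functions are stubs standing in for real trained biometric models.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical threshold; real systems tune this against false match/non-match rates.
MATCH_THRESHOLD = 0.8

@dataclass
class Enrollment:
    user_id: str
    face_template: str  # in practice, a biometric template held under the user's control

def face_match_score(face_capture: str, reference: str) -> float:
    """Stub: a real system returns a similarity score from a face-matching model."""
    return 1.0 if face_capture == reference else 0.0

def is_live(face_capture: str) -> bool:
    """Stub: a real liveness check rejects paper, mask or screen-replay spoofs."""
    return not face_capture.startswith("spoof:")

def sign_up(user_id: str, id_document_face: str, selfie: str) -> Optional[Enrollment]:
    """Verify the live selfie against the ID document photo before creating the account."""
    if not is_live(selfie):
        return None
    if face_match_score(selfie, id_document_face) < MATCH_THRESHOLD:
        return None
    return Enrollment(user_id=user_id, face_template=selfie)

def reverify(enrollment: Enrollment, fresh_capture: str) -> bool:
    """Run periodically in-session, so a child cannot simply use an adult's account."""
    return is_live(fresh_capture) and \
        face_match_score(fresh_capture, enrollment.face_template) >= MATCH_THRESHOLD

enrolled = sign_up("alice", id_document_face="alice-face", selfie="alice-face")
print(enrolled is not None)                        # True: adult verified at sign-up
print(reverify(enrolled, "alice-face"))            # True: same person mid-session
print(reverify(enrolled, "child-face"))            # False: different face fails
print(reverify(enrolled, "spoof:photo-of-alice"))  # False: spoof fails liveness
```

The point of the continuous `reverify` step is that a one-off check at sign-up is not enough: it is the periodic re-check that prevents account sharing after onboarding.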