Mitch Prinstein, a clinical psychologist and chief science officer for the American Psychological Association, is deeply concerned that all this talk about, and movement toward, the metaverse is headed for disaster. Mental health experts are asking whether the metaverse will be a safe place, especially for kids and teens. Looking back at Second Life, which launched in 2003, we discovered it wasn't safe for anyone, let alone teens. Have we learned nothing since? Back in September, the Wall Street Journal published confidential Facebook documents showing that its platforms, especially Instagram, were harmful to a significant percentage of teens, most notably teenage girls, and specifically when it comes to body image. According to Facebook's internal documents, among teens who reported suicidal thoughts, 6% of American users traced the issue directly to Instagram, and 32% of girls said that when they already felt bad about their bodies, Instagram made them feel worse.
Fast-forward to 2022. Albert Rizzo, a psychologist who serves as director for medical virtual reality at USC's Institute for Creative Technologies, says that today's social media platforms are already dangerous for some kids and teens, and that virtual reality's level of immersion could make those problems even worse. "There's a potency about being immersed in a world that is different than observing and interacting…through a flat screen monitor," Rizzo says. "Once you're actually embodied in a space, even though you can't be physically touched, we can be exposed to things that take on a level of realism that could be psychologically assaulting." Immersion, he argues, is creating more loneliness, far more body image concerns, and exposure to dangerous content related to suicidality.

One platform, VRChat, already shows evidence of dangers for young users. In December, research from the nonprofit Center for Countering Digital Hate (CCDH) found that minors were regularly exposed to graphic sexual content, racist and violent language, bullying, and other forms of harassment on VRChat, which is typically accessed through Meta's Oculus headsets. "Virtual reality really does need a lot of safety built in from the start, because you can't search [the metaverse] for hate or sexual abuse," CCDH CEO Imran Ahmed told CNBC. "You can't. It happens in an instant [and] there's nothing you can do."

The bottom line: while the metaverse is still in its infancy, we must build a metaverse responsibility doctrine and enforce it. Our kids' mental well-being and future are at stake.