A place where I can write...

My simple blog of pictures of travel, friends, activities and the Universe we live in as we go slowly around the Sun.



December 15, 2022

Governance?

An alternate future for metaverse governance

By DEREK ROBERTSON 

Meta currently has its share of headaches in the real world: Layoffs, investor scrutiny, a lawsuit from the FTC.

And now it has a new one in the virtual world: Governing the metaverse, and the unpredictable people who populate it.

Last month I wrote about a report from a University of California, Berkeley researcher that criticized the company for inconsistent, ambiguous enforcement of its community guidelines in virtual worlds, especially when it comes to defining acceptable user behavior.

When a user logs onto Meta’s Horizon Worlds virtual space, they agree to a set of rules laid out by the company for that world specifically — but those rules are somewhat limited and vague, the report’s author Rafi Lazerson wrote, without clear demarcation of where they overlap with or are superseded by Meta’s already-established rules for 2D social media. When it comes to getting people to actually use the digital spaces Meta is spending many millions of dollars on, clarity around those rules matters. What kind of world does Meta want to build to justify its investment, and how did it decide on the rules that will govern it?

“The initial assumption for some people in the metaverse product group was that what we’ve done with trust and safety and integrity for other platforms… we can apply those models to the metaverse,” said Zvika Krieger, the former head of Meta’s Responsible Innovation team who worked with multiple Reality Labs product teams. “A number of folks early on had a bias toward the idea that we already know how to do this, so we’ll just do it in the way we know. But very quickly the scales fell from our eyes.”

Krieger cited a February article from Buzzfeed’s Emily Baker-White titled “Meta Wouldn’t Tell Us How It Enforces Its Rules In VR, So We Ran A Test To Find Out” as a crisis point for the teams writing the rules for Meta’s metaverse. In the article, Baker-White describes purposely creating a world within Horizon Worlds that broke numerous rules that existed on Facebook, specifically bans on QAnon and conspiracy-related content, only for subsequent complaints to moderators to go unheard.

“It showed how even for the level of content moderation that we did on Instagram and Facebook, we didn't have effective systems in place to do it in Horizon Worlds,” Krieger said.

That’s obviously a problem. But the metaverse has, well, additional problems: Personal harassment, when it happens, can feel much more intense and pervasive in an “embodied” virtual space. And when most communication is done verbally, as opposed to via text, it’s harder to censor forbidden words or phrases, or to stop harassment before it happens.

Meta is currently trying to combat that kind of harassment with various design tools, like allowing users to create impermeable virtual “boundaries,” or introducing the ability for parents to monitor their teens’ use. (Only users 13 and up are allowed to use Quest’s app store, and Horizon Worlds is meant for users 18 and up.)

Krieger argues that this is actually one of the key differences between Meta’s so-far relatively underutilized Horizon Worlds and more robust metaverse-like spaces like Roblox or Minecraft, each of which boasts tens of millions of daily active users. Those games, as he described to me in an interview last week, are built from the ground up to be kid-friendly. There’s no push-and-pull, like on Elon Musk’s Twitter, between adult agency and the demands of creating a friendly community, because there is no adult agency, or at least not the expectation of it.

So how do you recreate the general good vibes and harmony of those kid-friendly platforms in a space meant for people with (at least notionally) a working understanding of the First Amendment? Krieger, who is no longer employed by Meta and is now consulting with other companies attempting to build virtual worlds, says they might need to, in effect, reverse-engineer those spaces: Not primarily through censorship, but through incentivizing the kind of community bonds and cooperative behavior that video games inherently inspire.

“How do you create a culture where users feel a sense of ownership and responsibility for perpetuating that culture? If you know what the platform is asking of you, then a user can feel empowered to say, ‘Hey, I just saw behavior that doesn't align with this,’ and they feel more empowered to call it out,” Krieger said, also mentioning the idea of offering rewards for calling out other users’ good behavior.

“A lot of this sounds like hokey, kindergarten-classroom type stuff, but what's nice about working with gaming companies that work with kids is that it’s not such a cheesy thing to do when you’re in a kids’ environment,” he said. “It’s intriguing to me that one very successful corner of the metaverse is being incubated in that space.”
