Governing Virtual Reality Social Spaces

March 5, 2018

“You don’t gank the noobs,” my friend’s brother explained to me, growing angrier as he watched a high-level player repeatedly stalk and then cut down my feeble, low-level night elf cleric in the massively multiplayer online roleplaying game World of Warcraft. He logged on to the server as his “main,” a high-level gnome mage, and went in search of my killer, carrying out two-dimensional justice. What he meant by his exclamation was that players have developed a social norm against “ganking,” or killing, the low-level “noobs” just starting out in the game. He enforced that norm by punishing the overzealous player with premature annihilation.

Ganking noobs is an example of undesirable social behavior in a virtual space, on par with cutting people off in traffic or butting ahead in line. Punishments for these behaviors take a variety of forms, from honking, to verbal confrontation, to virtual manslaughter. Virtual reality social spaces, defined as fully artificial digital environments, are the newest medium for social interaction. Increased agency and a sense of physical presence within a VR social world like VRChat allow users to experience both positive and negative situations more intensely, reopening the question of how best to govern these spaces.

When the late John Perry Barlow, a co-founder of the Electronic Frontier Foundation, published “A Declaration of the Independence of Cyberspace” in 1996, humanity stood on the frontier of an online world bereft of physical borders and open to new, emergent codes of conduct. He wrote, “I declare the global social space we are building to be naturally independent of the tyrannies [governments] seek to impose on us.” He also stressed the role of “culture, ethics and unwritten codes” in governing this new society, where the First Amendment served as the law of the virtual land. Yet Barlow’s optimism about the capacity of users to build a better society online stands in stark contrast to current criticisms of social platforms as cesspools of misinformation, extremism, and other forms of undesirable behavior.

As a result of VRChat’s largely open-ended design and its wide user base drawn from the PC and headset gaming communities, user behavior spans a broad spectrum. On one hand, users have experienced virtual sexual harassment and incessant trolling by mobs of poorly rendered “echidnas” spawned by the Ugandan Knuckles meme. On the other hand, VRChat is also a source of creativity and positive experiences, including collective concerts and dance parties. When a player suffered a seizure in VRChat, others stopped and waited to make sure he was okay and sanctioned players who tried to make fun of the situation. VRChat’s response to social discord provides a good example of governance in virtual spaces and of how layers of governance interact to improve user experiences.

Governance is the process of decision-making among stakeholders involved in a collective problem that leads to the production of social norms and institutions. In virtual social spaces such as VRChat, layers of formal and informal governance are setting the stage for norms of behavior to emerge. The work of political scientist Elinor Ostrom provides a framework for understanding how rules evolve to solve social problems. In her research on governing common-pool resources, she emphasized the importance of including multiple stakeholders in the governing process, instituting mechanisms for dispute resolution and sanctioning, and making sure the rules and norms that emerge are tailored to the community of users. She wrote, “building trust in one another and developing institutional rules that are well matched to the ecological systems being used are of central importance for solving social dilemmas.” Likewise, the governance structure that emerges in VRChat is game-specific and depends on the enforcement of explicit formal and informal laws, physical game design characteristics, and the social norms of users. I delve into each layer of governance in turn.

At the highest level, the U.S. government has passed formal laws and policies that affect virtual social spaces. For example, the Computer Fraud and Abuse Act governs computer-related crimes and prohibits unauthorized access to users’ accounts. Certain types of content, such as child pornography, are illegal under federal law. At the intersection of VR video games and intellectual property law, publicity rights govern the permissions process for using celebrities’ likenesses in avatars. Trademark and copyright law determines what words, phrases, symbols, logos, videos, or music can be reproduced in VR and what counts as “fair use.”

Game designers and gaming platforms can also employ an explicit code of conduct that goes beyond formal federal laws and policies. For example, VRChat’s code of conduct details proper mic etiquette and includes rules about profanity, sexual conduct, self-promotion, and discrimination. Social platforms rely on a team of enforcers: VRChat has a moderation team that constantly monitors its virtual worlds, external reviewers look at flagged content, and in-game bouncers watch behavior in real time and remove bad actors.
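
To make that enforcement loop concrete, here is a minimal sketch of how a flag-and-review pipeline might be structured. All of the names and fields are hypothetical illustrations, not VRChat’s actual moderation tooling.

```python
from collections import deque
from dataclasses import dataclass, field
from typing import Optional


@dataclass
class Report:
    """A user-filed flag against another player, awaiting human review."""
    reporter: str
    target: str
    reason: str  # e.g. "mic spam", "harassment"


@dataclass
class ModerationQueue:
    """Hypothetical flag-and-review pipeline: reports queue up until a reviewer acts."""
    pending: deque = field(default_factory=deque)
    sanctions: dict = field(default_factory=dict)  # target -> list of actions taken

    def flag(self, report: Report) -> None:
        # Users (or in-game "bouncers") push reports into the review queue.
        self.pending.append(report)

    def review_next(self, action: str) -> Optional[Report]:
        # An external reviewer takes the oldest report and applies a sanction
        # (warn, mute, temporary ban) according to the written code of conduct.
        if not self.pending:
            return None
        report = self.pending.popleft()
        self.sanctions.setdefault(report.target, []).append(action)
        return report


# Example: a player files a report and a reviewer works through the queue.
queue = ModerationQueue()
queue.flag(Report(reporter="player_a", target="player_b", reason="mic spam"))
queue.review_next(action="24-hour mute")
```

The point of this arrangement is that flagging only queues a report; the sanction itself comes from a human reviewer applying the written rules.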

By virtue of their technical decisions, game designers also govern the virtual spaces they create. For example, the design decision to place a knife or a banana in a VR social space will affect how users behave. VRChat has virtual presentation rooms, courtrooms, and stages that prompt users to do anything from singing to stand-up comedy to prosecuting other users in mock trials. Furthermore, game designers can build in mechanisms that empower users to flag inappropriate behavior or mute obnoxious players, functions that exist in VRChat.
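
As a rough illustration of this kind of design-level governance, the sketch below shows how a client-side mute list might work: the filtering happens on the muting user’s own client, with no moderator in the loop. The names and structure here are assumptions for the sake of example, not VRChat’s actual implementation.

```python
from dataclasses import dataclass, field


@dataclass
class LocalClient:
    """Hypothetical client-side mute list: governance built into the game design."""
    user_id: str
    muted: set = field(default_factory=set)

    def mute(self, other_id: str) -> None:
        # Muting is a local decision; no moderator needs to approve it.
        self.muted.add(other_id)

    def unmute(self, other_id: str) -> None:
        self.muted.discard(other_id)

    def should_play_voice(self, speaker_id: str) -> bool:
        # The audio layer consults this check before playing an incoming voice stream.
        return speaker_id not in self.muted


# Example: once "noisy_gnome" is muted, their audio is simply dropped client-side.
client = LocalClient(user_id="me")
client.mute("noisy_gnome")
assert not client.should_play_voice("noisy_gnome")
```

Because the check runs locally, a user never has to wait for a moderator in order to stop hearing an obnoxious player.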

Earning a reputation for malfeasance and poor user experience is bad business for VRChat, so the company recently re-envisioned its approach to governance. It acknowledged the task in an open letter to users: “One of the biggest challenges with rapid growth is trying to maintain and shape a community that is fun and safe for everyone. We’re aware there’s a percentage of users that choose to engage in disrespectful or harmful behavior…we’re working on new systems to allow the community to better self-moderate and for our moderation team to be more effective.” The letter detailed where users could provide feedback and ideas to improve VRChat, suggesting that users can be actively involved in the rule-making process.

In her Nobel Prize lecture, Elinor Ostrom criticized the oft-made assumption that enlightened policymakers or external designers should be the ones “to impose an optimal set of rules on individuals involved.” Instead, she argued that the self-reflection and creativity of the users themselves could serve “to restructure their own patterns of interaction.” In a game, the social norms that result are a form of governance at the most local level.

Ostrom’s framework demonstrates that good social outcomes emerge through our collective actions, which are influenced by top-down formal rules from platforms and bottom-up norms from users. The goal of stakeholders involved in social VR should be to foster the development of codes of conduct that bring out the best in humanity. Governance in virtual worlds is a process, and players in social spaces have a large role to play. Are you ready for that responsibility, player one?
