User perspectives on identity and safety in social VR

Prof. Dr. Matthias Quent

Professor of Sociology and chairman of the Institute for Democratic Culture at Magdeburg-Stendal University of Applied Sciences. He also heads the Immersive Democracy project.

Sara Lisa Vogl

Co-founder of the non-profit Women in Immersive Technologies Europe (WIIT) and member of the World Economic Forum’s Global Future Council for VR/AR. Sara has been collaboratively constructing and exploring VR experiences since 2013. With a background in communication arts and interactive media and in love with the idea of new worlds, she is on a mission to go beyond the status quo of what immersive virtual realities are and to explore their diverse potential for the future.

Background and significance     

Virtual worlds offer unique opportunities for individuals to interact, collaborate, and form relationships in digital spaces. Virtual online spaces have gained significant prominence in the last two decades, through advancements in technologies like virtual reality (VR), augmented reality (AR), artificial intelligence (AI), and general computing power. These digital environments provide individuals with immersive and interactive spaces and scenarios where they can engage in social interactions, explore diverse identities, and experience various forms of relationships. The increasing relevance of social life within virtual worlds presents a unique opportunity to examine the implications and benefits for individuals who spend substantial amounts of time in these online social spaces.

Virtual worlds are computer-simulated environments where users can interact with each other and with the digital environment itself. These digital spaces can take the form of fully immersive virtual reality (VR) environments or less-immersive platforms accessed through computer screens or mobile devices like smartphones. Virtual social spaces offer participants a range of experiences, from gaming and socializing to educational and professional interactions. In this study, we focus on fully immersive, head-mounted-display-based social interaction.

The distinguishing features of virtual worlds include:

Immersive Environments: 

Through VR technology, users can engage with three-dimensional environments that simulate real-world experiences or fantastical settings. Virtual worlds can provide users with a sense of immersion and presence, blurring the line between the physical and digital realms. The interactive nature of VR and the direct feedback of avatar embodiment heighten the immersion in these environments. Higher levels of immersion enhance the feeling of being present in the virtual world, creating more engaging and rewarding experiences in return.

Avatars: 

Avatars are digital representations of users within the virtual worlds. They serve as the user’s virtual personas, allowing them to navigate and interact with the environment and other participants. Avatars can be customized to reflect people’s preferences and can range from human-like representations to imaginative creatures or objects. Avatars play a crucial role in self-expression and identity exploration within virtual worlds. The effect of avatars on people differs fundamentally depending on whether they are experienced in a fully immersive state inside VR or on a 2D screen such as a mobile phone.

Social Networking Capabilities: 

Virtual worlds experienced through VR foster social connections by enabling people to interact and communicate with each other in ways very similar to real life. The immersive nature of VR, combined with the real-time embodiment of full-body-tracked avatars, enables social interactions that include body language and can facilitate a sense of presence similar to real-life social interaction. These platforms typically provide features such as voice and text chat, allowing users to engage in real-time conversations and supporting the formation of communities, friendships, and professional networks.

The significance of virtual worlds in today’s society is multifaceted:

Social Connection and Interaction: 

Virtual worlds offer individuals a way to connect with others, independent of physical distance or geographical barriers. They provide a platform for socialization, enabling people to meet, socialize, and collaborate with others who share similar interests or goals. Virtual worlds have become important social spaces for forming relationships, creating communities, and fostering a sense of belonging for individuals.

Entertainment and Gaming: 

Virtual worlds offer a wide range of gaming experiences, from massive multiplayer online games (MMOs) to virtual reality (VR) gaming. VR has profoundly impacted the gaming industry, providing immersive and interactive environments where gamers can slip into the shoes of the main character. Virtual worlds have become popular spaces for gamers to explore and socialize, or to compete and engage in cooperative gameplay with others.

Education and Training: 

Virtual worlds and scenarios are widely used in education and training, offering innovative and immersive learning experiences. They provide platforms for simulations, role-playing scenarios, and virtual classrooms, enabling learners to acquire new knowledge and skills in an engaging and interactive way. Virtual worlds can facilitate experiential learning, professional development, and collaborative problem-solving, among a wide variety of other factors that may enhance the way we learn and train.

Creative Expression and Commerce: 

Virtual worlds serve as spaces for creative expression, allowing users to design and build virtual objects, environments, and experiences. Users can engage in virtual commerce, sell virtual goods and services, and monetize their creations. Virtual economies like in Second Life have emerged, where users can buy, sell, and trade virtual assets, fostering a vibrant digital marketplace that is independent from users’ physical locations or socio-cultural backgrounds.

Research and Innovation: 

Virtual worlds have become important research tools for various disciplines, including psychology, sociology, medicine, and architecture, through the near-infinite possibilities of simulation.

In summary, virtual worlds offer immersive environments, customizable avatars, and social networking capabilities, providing opportunities for social connection, entertainment, education, creativity, and research. As technology continues to advance, virtual worlds are expected to play an increasingly significant role in shaping how we interact, learn, and experience digital spaces.

Virtual Reality (VR) has evolved in recent years from a niche technology to a widely adopted platform with numerous applications. The modern use of the term VR goes back to the computer scientist Jaron Lanier, who established it in the 1980s (cf. LaValle 2020: 5). According to Steven M. LaValle, “The most important idea of VR is that the user’s perception of reality has been altered through engineering, rather than whether the environment they believe they are in seems more ‘real’ or ‘virtual’. A perceptual illusion has been engineered.” (ibid. 2020: 6) This immersive technology enables users to enter worlds vastly different from their physical surroundings, opening new dimensions of experience and interaction. Applications of VR systems range from video games, immersive cinema, and telepresence (as in Google Street View) to use in educational settings (cf. ibid. 2020: 9–22).

Among the most significant and increasingly relevant applications of VR systems are social VR platforms. These platforms “[…] provide virtual environments that users can visit and move around in, acting as a public space to meet other individuals to interact with and communicate with. The interest in social VR platforms is steadily growing, and social VR platforms such as VRChat are widely popular among VR users. These digital environments are visited by users through the use of an avatar that acts as a digital body.” (Stockselius 2023: 1)

A leading example of social VR platforms, and the focus of this article, is VRChat. “VRChat is considered one of the most popular social VR platforms, with the number of users rapidly increasing over the COVID-19 pandemic to 22,000 users per day and over 4 million total users. VRChat was created in 2014 and has raised over $95.2 million in investment, however, remains free for users to play. […] VRChat is less focused on gameplay and more centered around creativity and social interaction.” (Deighan et al. 2023: 2)

Arne Vogelgesang highlights that “social VR as a practice can be understood as embodied social role-playing in a system of networked and bounded virtual 3D spaces inhabited by avatars connected to human users. Due to technical limitations, a single space on a social VR platform can currently usually accommodate no more than 50 people at a time, which structurally favours the dynamic formation and dissolution of social groups and their localization” (Vogelgesang 2024: 1–2).

While VRChat offers vast opportunities for creativity and social interaction, it also brings significant challenges. Sabri et al. (2023) discuss, in their study ‘Challenges of Moderating Social Virtual Reality’, the difficulties associated with moderating such platforms. The anonymity and deep immersion provided by VR can foster both positive and negative behaviors, posing a challenge for moderators who must ensure a safe and enjoyable environment without stifling user freedom and creativity. These challenges range from combating harassment and abuse to enforcing age restrictions and ensuring privacy.

These challenges are not unique to VRChat but reflect broader societal concerns about immersive technologies. Matthias Quent (2023) highlights in his analysis of democratic cultures in the ‘next Internet’ that while VR offers significant opportunities at an individual level, it also poses major challenges at a societal level (cf. Quent 2023: 30; English translation). He sees immersive technologies like VR as offering significant opportunities, such as supporting psychotherapy for anxiety and depression, fostering empathy and inclusion through perspective-taking, and enabling innovative educational experiences. However, they also pose risks, as seen in platforms like Roblox, where user-generated content can promote discrimination, far-right ideologies, and group-focused enmity, violating community standards (cf. Quent 2023: 38; English translation).

The societal risks identified by Quent are further underscored by a recent study on the metaverse, which analyzed the experiences of U.S. adolescents. Based on a nationally representative survey of 5,005 U.S. adolescents aged 13 to 17, the researchers found that 44% of youth reported experiencing hate speech, 37% bullying, and 35% harassment in the metaverse. Sexual harassment (18.8%) and grooming or predatory behavior (18.1%) were significant concerns, with girls disproportionately affected (cf. Hinduja/Patchin 2024: 6).

These findings highlight the need for better policies, content moderation, and educational efforts to mitigate the risks associated with the metaverse for adolescents. Given these findings, the need for improved protection mechanisms becomes clear. This topic, along with user perspectives, is explored further in the following interviews.

What is social VR & VRChat?

Social VR refers to virtual reality platforms that emphasize interpersonal interactions, enabling users to meet, communicate, and engage in shared activities within virtual environments. One of the most popular and pioneering platforms in this domain is VRChat. Created in 2014 and launched on Steam in 2017, VRChat offers a diverse array of user-generated worlds and avatars, allowing participants to interact in an almost limitless range of settings. Beyond real-time voice and text-based conversations, users can partake in activities such as games, karaoke, movie nights, and educational sessions. As of 2023, VRChat has millions of registered users, exemplifying the growing appeal of virtual spaces for social connection and collaborative experiences. Users can access the platform on a PC, with a VR headset, or via a mobile app on smartphones (beta version). High-quality VR experiences come with high hardware requirements, with headsets connected to powerful computers. Such setups are particularly suited to displaying high-quality, high-resolution avatars; the visible level of detail can be reduced on the software side via settings. Fully immersive experiences thus require expensive hardware.

Reference to the Metaverse

The “metaverse” refers to a collective virtual shared space. It’s a vision of a fully immersive digital universe that people can inhabit, work, play, and socialize in, akin to a massive, interconnected digital society. 

Platforms like VRChat and other social VR experiences serve as early indicators or prototypes of what the metaverse might look like in the future. They allow users to transcend physical limitations, enabling them to interact with others in a virtual universe with diverse worlds and activities. In VRChat, for instance, the idea of user-generated content—wherein participants can create their own avatars, design their worlds, and curate their experiences—aligns closely with the broader metaverse vision. Current social VR platforms pave the way for a comprehensive and integrated digital ecosystem, where physical and virtual worlds become increasingly interconnected. As technologies advance and more platforms emerge, the vision of a unified metaverse becomes progressively tangible, with current social VR platforms serving as its foundational building blocks and environments for learning and experimentation. 

Research objectives and questions 

As part of our experimental research, we wanted to find out why people use social VR, what experiences they have had there, particularly with regard to aspects of identity, safety, and hate speech, and how they protect themselves from harassment. Questions asked in the interviews, conducted inside the social VR platform VRChat, included:

  • Which gender do you represent in VR? Does this differ from your real-life gender? 
  • Why do you use VR?
  • What makes you feel safe in VR?
  • What makes you feel unsafe, if anything?
  • Have you experienced harassment or hate speech in VR? 
  • If so, how often does it happen? How does it happen? How have these experiences affected you, perhaps compared to similar expressions on social media or in the real world? How do you deal with it? What helps you deal with it?
  • What ways do you know to protect yourself from harassment or hate in VR? Which of them have you already used? How often? 
  • What other protections would you like or need?
  • How many avatars are on your block list? Are they usually strangers or known people? 
  • What are the consequences of blocking? Are there any negative consequences or irritations as a result?
  • How does it feel to get blocked?
  • Do you perceive political statements in VR? If so, which ones? 
  • Have you noticed extremist or hateful activities?
  • Do you think criminal activities in VR should be prosecuted by police and courts?
  • What should policy makers, regulators or designers know about VR experiences like VRChat?

Research design and data collection methods 

To approach the questions, we conducted four qualitative guided interviews as avatars in VRChat in autumn 2023. The anonymised participants come from different countries and were recruited as a self-selected sample from the virtual network of VR designer Sara Lisa Vogl. The interviews were transcribed, coded according to the key questions and analyzed. The results of the explorative pilot research are limited and cannot be generalized. However, they provide first-hand insights into user behavior and experiences in social VR and thus fill an empirical gap in the primarily theoretical and technological research literature. Further research with more interviewees in virtual worlds can build on this.

The methodology employed involves providing insights into social interaction behavior and interaction design mechanisms within the metaverse, focusing on subjective and qualitative measures to capture the experiences of individuals interacting in different capacities within social online spaces. By observing and analyzing these interactions, we aim to interpret the gathered insights and establish relationships between the observed behaviors and the potential mechanisms that contribute to the described outcomes.

The results of this data collection will provide insights into concepts that should be evaluated for the design of virtual worlds, facilitating an enhanced understanding of the potential impact of social VR on individuals’ subjective experiences and their resulting perceived abilities in real-life contexts.

Through VR’s unique ability to change appearances and virtual bodies completely, interviewees have the opportunity to show up in various identities that are unrelated to their real-life bodies and identities regarding age, gender, and abilities. A common trend observed in VRChat is the preference of male-identifying individuals for wearing female anime-like avatars.

Findings from the interviews

VR Usage of interviewees

To gain insights into the patterns and diversity of VR usage among our participants, and how these relate to identity and safety issues inside and outside of VR, we asked interviewees how long and for what purposes they have been using social VR.

Participants have been using social VR, mainly VRChat, for two to five years.

For some of the participants, the experience of social VR is an alternative to real-life social interaction that still allows them to perform the desired shared activities and feel connected to friends. Especially for self-described introverts, meetings in social VR have the benefit of added layers of control over the environment. People can adjust the number of users visible to them, as well as the colors and audio of the virtual worlds and avatars. They can also completely hide other users they do not want to see or interact with.

Interviewee 1 describes how they can meet friends and have social interactions from the comfort of their home: “It is very good for socializing. […] I don’t need to drive anywhere, and I can hang out with some of my best friends. […] So, I get that feeling of closeness that I can’t really get in the real world right now.”

They go on to describe how they can adjust their VR experience to their preferences and do not have to be exposed to people or situations they have difficulties coping with: “I do not need to interact with people that I don’t want to, and having that extra bit of control over the situation is just amazing for me, especially because I’m an introvert.”

Interviewee 1 highlights the social benefits of VRChat, emphasizing that it allows them to meet and interact with friends from the comfort of their home while feeling as if they are sitting next to each other, watching a movie or engaging in other activities.

Additionally, they value the control VRChat offers over their social environment. They can adjust the volume of individual users, hide overly bright avatars, or block individuals they do not wish to interact with. This control is particularly beneficial for them as an introvert, as it allows them to manage social interactions in a way that suits their preferences and comfort level.

Interviewee 2 mentions how they leverage the opportunities of social VR to combat their real-life social anxiety: 

“My reason for starting playing this game was the way to deal with social anxiety and just talking to people in general. I was really bad at it as I was as a young adult […] bullied in school and stuff, so VRChat was a nice fresh breath of air in terms of being social with people, […].”

They further highlight that in VRChat people get to know each other mainly based on their personality instead of their physical appearance: “[…] The fact that people don’t see you for who you actually are but see you for a personality only. […] I feel like it’s a confidence boost […].”

Interviewee 2 describes that the social interaction in VRChat is like training in a safe environment with positive feedback that encourages them to engage in social interaction:

“I’m more confident interacting with people in general just because of the amount of hours that are put into actually talking to people in this game specifically. […] you just meet strangers from across the world and you start talking to them.”

Interviewee 2 discusses how they use VRChat to cope with social anxiety. VRChat provided a way to practice social interactions in a comfortable environment, helping them to improve their social skills.

They emphasize that in VRChat, people are judged based on their personality rather than physical appearance. This aspect helps bypass insecurities related to looks, offering a confidence boost as interactions are based on the personality projected through their virtual avatar.

Additionally, they describe VRChat as a safe training ground for social interactions. The positive feedback and practice gained from engaging with strangers worldwide in VRChat have increased their confidence in real-life social interactions.

Interviewee 3 describes that their main motive for social interactions in virtual reality is to escape reality, meet new people, and learn and practice the English language, as they aren’t a native speaker: “[…] I meet a lot of new people, make some, make a lot of friends and I start to speak English. I speak bad English, but I started learning and writing and speaking more well than before […].”

Interviewee 3 uses Virtual Reality primarily to escape from the stresses of reality and to relax. They are pleased with this choice as it allows them to meet new people, make friends, and practice their English skills. 

Interviewee 4 states their social VR usage is mainly driven by the wish to see and interact with people who are too far away to visit in person, and highlights the level of immersion achieved through full-body tracking in VR:

“I use VR because it helps in a way to have connections with people who are long distance or overseas and it makes you feel more closer to someone than you ever can be with just desktop games […] you can actually interact and […] you can like cuddle and like hug them and stuff […].”

Gender-Identity and embodiment of diverse genders

Another aim was to explore how the interviewees experienced and expressed their gender identity in social VR. We asked them about their gender identity in real life and how it matched or differed from their virtual avatar appearance and gender identification. We also inquired about their feelings of embodiment and agency in social VR, and how they were perceived and interacted with by other users. Through this set of questions, we expected to gain insights into the challenges and opportunities of fundamentally free and fluid gender expression and embodiment in social VR, as well as whether and how these experiences relate to the interviewees’ real-life identity and gender expression.

Interviewee 1 presents as gender fluid in real life and VR: 

“I am 22. I am gender fluid. (in real life) […] I tend to make myself just as kind of androgynous as possible. Most of my personal avatars I typically go with the male variant but or sometimes I will use like a female with a smaller chest.”

They describe the sound of a user’s voice as the main element by which other users identify gender in VR: “Of course some people still do use she/her here because I do have a more feminine sounding voice as well.” And they state that their choice of diverse pronouns was respected most of the time:

“But I have run into a number of times where I’ve said hey, you know, like please use like they/them or even sometimes he/him pronouns for me. And a good number of times they will be like, yeah, of course.”

They also report that similarly to reality some people do not respect their chosen pronouns: “Of course you’ve run into some people that don’t respect that, but you’ll run into that anywhere, unfortunately.”

Interviewee 2 identifies as male and presents as female in virtual reality while using he/him pronouns and presenting as male in real life: 

“I’m a he/him. I’m a dude 100%, always been a man. I’ve never been questioning my own gender. I just like being a cute anime girl in VRChat, OK? Yes, my choice of avatar does not represent my gender and all right, it’s because guys want to be cute too, without being a woman.”

Interviewee 3 states that their pronouns and gender identification in VR changes with the Avatar they wear and the way they present in VR: “In VR I usually go with the avatar, so when I am a female, I prefer people use she. When I am a male, I prefer he, and when I am an object so like a lamp or stuff like that, it doesn’t matter.”

This differs from their real life where they represent their birth gender and use the related pronouns only: “No, it’s something I just used to do in VRChat – in real life I am not into this stuff and people call me for what is in.” (I 3) 

Interviewee 4 presents as the same gender in VR as in reality: “I go by she/her, and so that’s the same in VR as in real life.”

Experienced harassment or hate speech

Harassment – especially sexual harassment – as well as anti-Semitic, racist, or transphobic hate comments and extremist content not only have harmful individual and societal consequences, which are well researched for social media, but can also lead people to avoid certain virtual platforms and spaces. Protecting users from harassment and creating an inclusive and dignity-oriented climate is particularly important in immersive virtual environments, especially since it must be assumed that, through embodiment and immersion, such content affects users more intensely than in text-based media. No reliable studies or figures are yet available on the spread of hate messages in immersive environments, partly because the platform providers are (so far) – unlike the particularly large social media platforms – not obliged to provide public and transparent information on measures against hate. Due to real-time presence, real-time verbal communication, and the absence of recordings, documenting hate messages in social VR is particularly difficult. In the interviews, respondents reported varying degrees of experience with harassment or hate speech in VR.

Interviewee 2 observes “[…] definitely a lot of racism in this game, like in a lot of unfiltered language that is very racist or just offensive to people and genders and all that stuff. It’s very offensive.” This can be encountered by users in very normal situations. In addition to racist discrimination, gender discrimination in particular is reported:

“[…] People who come out, especially online […], [saying] ‘I’m transgender or I’m into my own gender stuff,’ […] definitely don’t deserve to have random slurs thrown at them.” (I 2)

Influenced by political discourse in the wider world, the person observes mostly hate-filled messages against the LGBTQ community: “Hateful comments and stuff towards the entirety of LGBTQ [community]. […] Because it’s become pretty prominent outside of VRChat as well […].” (I 2)

Interviewee 1 makes direct comparisons to social media and therefore questions whether social VR should be understood as a game or a social medium: “VRChat is more of a social media that has games in it […] It’s a social app. It’s not a game. And so unfortunately, sometimes you will stumble into the wrong groups.” (I 1)

In social VR, too, harassment and hatred are found primarily in certain groups or spaces. The characteristics at which the experienced hate messages are directed are closely related to the virtual appearance as an avatar, so that the user is attacked for her avatar. However, she relates this to her personal identity and does not say “my avatar was attacked for being a furry” but “I’ve been hated on for being a furry” (I 1). This speaks to a great deal of identification with the avatar and suggests that attacks are perceived personally rather than as avatar-related. In addition, there is gender-based discrimination, “I’ve been hated on for being a female.” (I 1), and the association of both characteristics with weight-based discrimination: “I’ve been hated on for being a female furry because apparently that immediately makes you 500 pounds.” (I 1). In such situations, the user makes short work of it: “It’s just the block button and then you don’t need to talk to those people anymore.” (I 1). Interviewee 3 also reports about people “who hate the Furry community” (I 3). This raises the question of whether avatars and sub-communities in virtual realities give rise to new forms of digital hate communities that are directed against certain classes of avatar identities.

Interviewee 2 states, “I don’t think there’s really any counter speech, it’s just a straighter blocking.” The person observes different levels of sensitivity in dealing with hate speech and in using the blocking function:

“It really depends on the person who’s on the receiving end of the offensive speech […] Personally, if anyone’s doing hate speech around me, I’m the person to just block because I can’t be bothered dealing with it.” (I 2)

Interviewee 2 makes a direct comparison to social media, noting that in VRChat there is “[…] definitely not as much censorship in chat as there is on social media […] But [VRChat], it’s probably the worst. The difference is that you have to be unlucky enough to end up in the same room as those people, whereas with social media, everyone has access to it.”

Consequently, the technical barrier to accessing social VR still provides some protection from being flooded with hate messages compared to established social media. At the same time, the person points out that such dark content has a more intense effect in social VR and requires more active action: “It’s just like I feel like [VRChat] is worse in person […] But I’d say [VRChat] is way more in your face about [it] than social media is.” (I 2)

Interviewee 1 compares VRChat to social media, stating that it functions more like a social network than a game. They highlight that, like social media, VRChat has different groups and communities, some of which can be hostile.

Interviewee 2 reports widespread racism, offensive language, and hate speech targeting gender and sexual orientation. They note that these issues are encountered in everyday social interactions within VRChat.

Overall, the interviewees agree that while social media and VRChat share some similarities in terms of harassment, the immersive nature of VRChat makes negative interactions feel more personal and intrusive, requiring more proactive measures to manage and mitigate.

Experiences of harassment are reported by user 1 mainly in public spaces, which she therefore avoids and meets mainly with her own circle of friends in VR: “Harassment typically comes in whenever I go to public worlds […] that’s one of the main reasons why I don’t go to public worlds anymore.” (I 1)

Most notably, the interviewee experiences the harassment in verbal form because the user has hidden other avatars by default in the settings: “It’s mostly speech, especially because I personally have everyone’s avatars turned off by default, and then I have to manually go in and show people’s avatars.” (I 1) However, there are also particular glitches related to the specific implementation of the embodiment in the metaverse. For example, offensive images can be used as avatars or high-resolution avatars can cause the hardware to crash due to overload. The user protects herself against this by disabling avatars by default:

“I used to be close friends with a guy who streamed UM music on VR chat […] He played live guitar, and it was mostly well-received. But there were disturbing avatars, like gore-themed PNGs, and crashers with particle effects that could crash even powerful computers. That’s why I keep avatars off by default and only enable them after ensuring someone seems decent.” (I 1)
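The default-deny strategy the interviewee describes, hiding all avatars until she manually shows them, amounts to an allowlist on the client. A minimal sketch of that logic follows, in Python purely for illustration; `AvatarFilter` and its methods are hypothetical names, not VRChat’s actual settings API:

```python
# Hypothetical sketch of the "avatars off by default" strategy:
# an allowlist, where an avatar renders only after the user
# explicitly shows it. Illustrative only, not VRChat's real API.
class AvatarFilter:
    def __init__(self):
        self.shown = set()  # users whose avatars we chose to display

    def show(self, user):
        self.shown.add(user)

    def should_render(self, user):
        # Default-deny: unknown avatars stay hidden (e.g. shown as a
        # placeholder), guarding against offensive or crasher avatars.
        return user in self.shown


f = AvatarFilter()
f.show("trusted_friend")
print(f.should_render("trusted_friend"))  # True
print(f.should_render("stranger"))        # False
```

Default-deny trades social richness for safety: strangers stay hidden until explicitly trusted, which is exactly the trade-off the interviewee accepts to avoid offensive and crasher avatars.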

In addition, there were situations where the interviewee, along with a friend, made recordings in VR for Twitch streams:

“[…] Almost every stream had someone trying to crash it with a malicious avatar or spamming the N-word because they knew it was streamed on Twitch. I blocked people daily and used a shader-equipped avatar to see hidden users through walls, allowing us to block them via the menu.” (I 1)

The statements reveal the high technical proficiency of these early VR experts, who have developed various tricks to deal with unintended, sometimes malicious, interventions in the environments. They also point to a key technological challenge on the way to photorealistic real-time presence and highly authentic avatars: consumer hardware is, for the most part, not (yet) capable of rendering them. Differences in the quality of users’ technological equipment can be exploited, consciously or unconsciously, to the disadvantage of people with less powerful hardware: a new form of digital divide in immersive environments. For the development of innovative technologies on the one hand and the establishment of interoperability on the other, this represents a major challenge that is directly relevant to users’ sense of security and stability in immersive virtual environments.

Interviewee 3, on the other hand, reported barely noticing any harassment or hate speech; only once had she been told to “kill me” (I 3). For the interviewee, harassment could mainly come from people close to her; she had not experienced any from strangers: “No, I haven’t experienced that from strangers. They’re all nice, even in public […] I feel like I see most people just talking to each other like that. Yeah, I don’t get harassed that much.” (I 3)

Interviewee 4 reported a lot of gender-based harassment in social VR:

“Mostly just people being like as soon as I speak because I’m a female, they’ll be like, my God, it’s a female here. What is she doing? She should be like back in the kitchen or something like that […].”

In addition to language, the avatar is also used for harassment: 

“But mostly kids that usually come up and are like, oh my God, you want to see a trick and either pull out like something offensive on an avatar.” Again, this is mainly a problem in public spaces. For the interviewee, however, harassment in VR is not as hurtful as in real life. (I 4)

The interviewees reported varied experiences with harassment and hate speech in VRChat, highlighting significant challenges and the ways users cope with these issues. The interviews reveal that harassment in VRChat is more prevalent in public spaces, prompting users to seek refuge in private groups. Verbal harassment and technical exploits are common, and users often need advanced technical skills to protect themselves. The digital divide exacerbates the issue, leaving users with less powerful hardware vulnerable. While some users report minimal harassment, gender-based harassment remains a significant problem, particularly in public spaces.

Experienced political extremism 

When asked about extremist or hateful political statements in VR, the interviewees reported hardly any corresponding observations. Interviewee 1 assumes that such content exists, because “[…] everyone has their own groups and often, from what I’ve witnessed, it stays out of the public world. Or if it’s in a public world, it gets shut down very quickly if it reaches an extremely hateful level.” Personally, however, they “didn’t get involved in anything extremely political” (I 1). The interviewee did notice individual statements in favor of Donald Trump that disturbed their need for calm and relaxation:

“I’ve run into people who have tried to say, Donald Trump 2024, build the wall. And I want to say that in most cases that I’ve seen, they were just bad memes. And I end up blocking those people, too, because I’m here to relax, not to think about politics.” (I 1)

Respondent 2 argues similarly, staying away from politics in VR:

“I never played VRChat to bring political views into my life, so I stayed away from anything political, OK?” While topics like the Russian war on Ukraine do come up (“Again, Ukraine, Russia, but inflation is a pretty big deal and how it affects people in different countries, things like that”), “it was never anything super political.” (I 2)

Interviewee 3 also initially denied encountering political content, but then stated that they had contact with users in VR “[…] who are Nazis for no reason. So, the classic, yes, the classic stupid people who bring back Nazis and fascists like that. People who hate for no reason.”

Interviewee 4 also reported “Trump supporters or like fascist or extremist behavior, that kind of thing”, but not “too much of it”. Political discourse can also be observed in VR as a bystander: “I’ve met people talking about it in public and they’ve been arguing amongst themselves and I’m standing here listening to it just to hear their sides”. Regarding the feeling of safety in social VR, the interviewee describes rippers in particular as a danger and complains that the platform does not respond sufficiently to users’ concerns and problems.

In summary, all respondents reported experiences with harassment and hate speech, albeit to varying degrees of intensity. Gender-related devaluations were mentioned particularly frequently. Verbal statements are in the foreground, but destructive uses of VR technologies are also experienced as harassment. The central way of dealing with this is to block those who discriminate or harass; measures such as reporting accounts or even criminal consequences for haters were not mentioned. The social consequences of blocking statements and behaviors perceived as offensive or discriminatory, both for social relationships in social VR and for social behavior in the physical world, therefore require special attention in future research.

Current opportunities of protection

We asked interviewees how they use and evaluate the protection mechanisms that VRChat currently offers. Among other things, we asked about their awareness and usage of the VRChat Trust and Safety system, which allows users to customize their safety settings, block or mute other users, report abusive behavior, and create secure instances. Through these questions we aim to gain insights into the effectiveness and limitations of the protection mechanisms in VRChat and what they mean for people’s ability to interact socially in VR.

Interviewees responded that they tend to mute and block other users on a daily basis. The people on their blocklist are mostly strangers. 

Interviewee 1 explains the process of the vote kick functionality on the platform VRChat: 

“So how it works is anybody is able to put in a vote kick for someone who is being loud, being, harassing, whatever the reason may be […]. You can click on the person and do vote kick and then it’ll pop up on everybody else’s screen saying a vote kick has been started against player name. Do you agree? And you can select yes, or no?” 

The interviewee also describes that the feature does not always work as intended:

“But I have seen multiple instances where everyone, well most everyone would click yes and agree for this person to be removed from the instance. And the vote would go through, but the person would still remain in there. This has been a VR chat issue for as long as I’ve been playing.” (I 1) 
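The flow described above (anyone may start a vote, everyone in the instance confirms with yes or no, and a passing vote should remove the target) can be sketched as a simple majority tally. This is a hypothetical reconstruction for illustration only, not VRChat’s actual implementation; the majority threshold and the class structure are assumptions:

```python
# Hypothetical sketch of an instance-level vote kick, as described by
# Interviewee 1. Not VRChat's real implementation: the simple-majority
# threshold and the data structures are illustrative assumptions.
class VoteKick:
    def __init__(self, instance_users, target):
        # Every user in the instance except the target may vote.
        self.voters = set(instance_users) - {target}
        self.target = target
        self.votes = {}  # voter -> True (yes) / False (no)

    def cast(self, voter, agrees):
        if voter in self.voters:
            self.votes[voter] = agrees

    def passed(self):
        # Assumed rule: a simple majority of eligible voters agrees.
        yes = sum(1 for v in self.votes.values() if v)
        return yes > len(self.voters) / 2


vote = VoteKick({"a", "b", "c", "d"}, target="troll")
vote.cast("a", True)
vote.cast("b", True)
vote.cast("c", True)
print(vote.passed())  # True: 3 of 4 eligible voters agreed
```

The bug the interviewee reports would sit between the vote passing and the removal being applied: a tally like `passed()` returning true does not by itself guarantee that the server actually enforces the kick.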

Interviewee 1 noted that muting and blocking prevent future incidents with a given person but do not stop users from being exposed to initial hate or discrimination:

“[…] that’s a tool to help prevent future from that person, but it’s not gonna fully stop it before it happens. You are going to probably hear some sort of comment or see some sort of thing before you block them. That’s why I said the only foolproof way is to just not go to public worlds.”

The frequency with which Interviewee 1 uses the blocking and muting features depends on the situations and environments they are interacting in:

“[…] In situations where I was like back with my friend who was recording, it got to the point to where almost every stream there would be somebody who would either log in with a crasher avatar and try to crash him, or would think it was the funniest thing to just say the N word over and over again knowing that he was streaming on Twitch. So back then I used to block pretty much daily, and I actually have an avatar that has a shader on it to where you can see people’s avatars, even through walls.”

Interviewee 2 states they stopped going to public worlds to avoid needing to block or mute people: “I either just mute or block people right away. If I can’t be bothered with really hateful speech, that’s what I would do, at least to strangers. If it’s with friends and stuff, then I just tell them to shut up.”

Interviewee 2 describes that they usually feel safe but ‘Internet trolls’ are the main reason for them to feel unsafe on the social platform VRChat: 

“I don’t think I really feel unsafe in VRChat. I mean. There’s obviously a lot of people who are very racist specifically and offensive people that aren’t always fun to be around and listen to. […] Internet trolls are thing even in VRChat. So that’s definitely something I feel unsafe about […].”

Interviewee 3 mentioned that they generally feel safe even if they get harassed, but they are afraid of stalkers and hackers who might steal their personal data: “[…] people who like harass or stuff like that, I don’t mind too much. I can block them if they are too much, but the only things I don’t feel safe with are stalkers and hackers […].”

The interviewee further describes that they first mute the user in question before actually blocking them: “When they get too much, I block them, but usually I just mute the people but if I see they keep going even when I say stop, I’m gonna block them. So, the fact I could block a more easier person is something I like. […]. I try to understand why this person do this stuff because the other person know what he’s doing, what they doing. But if I see there is not feedback, I just block him.” (I 3)

Interviewee 4 also describes first muting a user who shows inappropriate behavior before blocking them:

“[…] if it gets too extreme, sometimes I will just block or mute the user. Sometimes I’ll mute them first, but then if they’re really trying to be annoying, even after them being muted to where I can’t hear them, I’ll just block their avatar entirely.”

The interviewee describes retreating to their own world and inviting only selected friends when they feel harassed or bothered by a user:

“And if they’re trying to like, follow me through another friend or so, since you can join off friends in other worlds, I’ll sometimes just go into my own like private world and then invite the few friends that I want.” (I 4)

Overall, while VRChat offers various protection mechanisms, their effectiveness is limited. Users adopt personal strategies, such as avoiding public spaces and using advanced blocking techniques, to enhance their safety and ensure a more positive experience in virtual environments.

Further protection suggestions and challenges

Following the previous questions about current protection methods and how our participants use them, we asked for the interviewees’ personal suggestions for new or improved safety features in social VR, based on their experience and insights.

Interviewee 1 likes the approach and tools provided by the social platform VRChat but states that the current protection systems are still buggy and need improvement:

“[…] VRChat […] should fix their vote kick thing. […] what VRChat has done to be able to make it so easy to control, like who you’re seeing, what you’re seeing, and being able to block people if needed is super helpful […].”

Interviewee 2 reflects on censorship of the spoken word compared to text, and on how censorship could affect the way people socialize in VR:

“I feel like it’s both good and bad that there’s not a lot of censorship. […] You would need to do voice recognition on everything. Probably that would be a big decision. That would probably make the game pretty unplayable for a lot of people because that amount of information your game suddenly needs to process.”

The interviewee goes on to point out that VRChat’s existing moderation features either do not work or do not yet deliver the desired outcome:

“[…] VRChat has never been really good at what would it get modern, moderate the game itself […] Getting reported, that would definitely help the censorship and VRChat and this general moderating people.” (I 2) 

Interviewee 2 also suggests a ranking and trust system: 

“[…] it’s basically like a kind of record of your behavior in a way that you’re suggesting where you can see okay this person, I didn’t know them before. And so, they come in the room and then you’re going to basically see how often they have been reported […].”

They also suggest an ID verification process, primarily to keep interactions with minors safe: “[…] as an example, if you’re below the age of 18, you should have a way to check box then. So that if you’re in a space within with somebody minor, there is no way to hide.” (I 2)

Interviewee 3 suggests a warning system to make users aware when someone on their blocklist is present: “[…] you want to get warned if a person that you have blocked is in a certain world, so you want to kind of have a warning about that, that you don’t join the same instance.”

Interviewee 4 points out concerns around anonymity as well as personal data and asset protection:

“I wish there was another way besides using a VPN changer or something, but something to change your IP address. Because there’s a lot of people in VR who can find that information about you and take that information, know your home address and everything else about you. And that is like really uncomfortable, and I wish people didn’t know how to do that […].”

Blocking as a personal and structural security strategy

We looked specifically at how the interviewees used and perceived the blocking system in VRChat. We asked them about the number and nature of people they had blocked, the reasons for and outcomes of blocking, and the challenges and drawbacks of blocking. By further investigating this popular protection mechanic, which has no direct equivalent in real life, we aim to gain insights into the role and impact of blocking as a personal and structural security strategy in social VR.

Interviewee 1 describes how the blocking feature affects the visibility of avatars of users: 

“The biggest thing is that if you block someone, it blocks it for you, but not for everyone else in the instance. So, I’ve run into things where I blocked someone and then like the poor person standing next to me didn’t want to block them for whatever reason. And then they started harassing the person next to me trying to get them to talk to me […].”
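The asymmetry described in this quote, where a block filters only the blocker’s own view while everyone else in the instance still sees and hears the blocked user, can be sketched as a per-viewer visibility filter. This is a minimal illustration under assumed names, not VRChat’s real data model:

```python
# Minimal sketch of client-side blocking: each user's blocklist
# filters only their own view of the instance, as Interviewee 1
# describes. Names and structure are illustrative assumptions.
def visible_to(viewer, instance_users, blocklists):
    """Return the users the viewer can see and hear in the instance."""
    blocked = blocklists.get(viewer, set())
    return {u for u in instance_users if u != viewer and u not in blocked}


instance = {"alice", "bob", "harasser"}
blocklists = {"alice": {"harasser"}}  # only alice has blocked them

print(visible_to("alice", instance, blocklists))  # {'bob'}
print(visible_to("bob", instance, blocklists))    # harasser still visible to bob
```

Because the filter is evaluated per viewer, blocking never removes the offender from the shared instance, which is why the harassment of the person standing next to the interviewee can continue.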

Asked about potentially being blocked by other users themselves, they state: “I mean, I don’t know if I’ve ever been blocked. It doesn’t tell you like when you do. Honestly, it doesn’t make a difference to me.”

Interviewee 1 goes on to describe who and why they block other users: 

“I don’t know how many are on my block list and I’m also not sure how to check. I can say probably like 2 to 300. […] most of the people on the block list are people who came up immediately saying slurs or people with crasher avatars or just small children that were being harassing and should not be on VR.”

Interviewee 2 notes the frequent presence of ‘trolls’:

“I know there was a way to see your full block list and your full mute list, but I don’t think that I don’t think that API is running anymore. […] but it’s definitely over 500. They’re usually strangers who are being toxic in public spaces. […] or just straight up trolls.”

Interviewee 4 has comparatively few users on their blocklist: 

“I try not to block as many people. So, 22, yeah. […] probably one of them was someone I knew before, but most of the rest of them are strangers. It’s people just coming up to you, screaming in their mic or doing something irritating like and just really trying to get at you. It’s just random people.”

The interviewee also describes negative consequences that arise from the blocking feature:

“[…] seeing it from another group standpoint and seeing a friend of a friend block somebody, there is usually an issue with that. Because then other people in the group who hang out with that person that’s blocked, they’ll be like, well, we can’t hang out and do events or group activities because you have them blocked. […] maybe that person is trying to like contact you in another way to try to get them unblocked and be like, hey, add me back, blah blah blah. […] I’ll try to give them a warning, but honestly, if they’re not listening to me, it’s an immediate block.” (I 4)

Interviewee 2 views blocking unemotionally:

“I have been blocked before and it’s never really been something that I cared about to be honest, it hasn’t really benefited me or affected me whatsoever […]”, and adds that people use the blocking feature merely to get a better view of the mirror in the virtual environment: “cause a lot of people will block you because that’s a mirror, right? This guy’s in the way, just to block him and sit where he sat, stuff like that happens a lot in public spaces.” (I 2)

Interviewee 3 describes being blocked by their ex-partner, without any confusion or irritation around it: “No, Like I say, I don’t get blocked, but I get gossiped. […] But get blocked maybe one time from my ex, Yeah, and. But there was nothing like specifically weird about that situation that I you could remember.”

Interviewee 4 reports that being blocked can be confusing and even hurtful:

“So, I have gotten blocked before [..] for absolutely no reason at all. I know I didn’t do anything or say anything hurtful. […] And when I get blocked, if it’s from a random person […] sometimes I’ll just laugh it off and I’ll be like, OK, whatever. They have their reason. Now, if it’s someone that’s a friend who has blocked me before, I’ll feel kind of hurt and I’ll be like, well, what did I do? What did I do to hurt you? What can I do to make this better?”

Overall, the interviewees describe blocking as a common tool for managing harassment in VRChat, but its effect is limited to the user who initiates the block. The feature can cause social complications and emotional reactions, depending on the context and relationship between users.

Potential legal consequences of actions in social VR

An important point of our study was to examine the legal implications of actions in social VR, specifically on VRChat. This is a difficult topic, as anonymity remains a very high priority for a majority of users and there is currently no social VR platform with a legal identity check. We asked interviewees about their awareness and opinions of existing and emerging laws that could apply to social VR platforms, to gain insights into the challenges and opportunities of legal governance in social VR. We also inquired about their views on the ethical and moral concerns that arise in social VR, such as identity, impersonation, consent, and harassment.

Interviewee 1 suggests that certain offenses committed in the metaverse should be prosecuted and tried in the jurisdiction of the user’s physical location:

“I believe that it would depend on what criminal activity I would also for example, say you have a restraining order against someone and that someone uses VR to try to bypass that restraining order. I do think that that should be a violation, and they should be charged via whatever court gave that restraining order.” (I 1) 

They make clear that theft of avatars and their 3D files is a major issue that is currently not prosecuted: “Now when it comes to the stealing avatars […] that is a breach of copyright. You are stealing somebody’s intellectual property.” (I 1)

Interviewee 2 sees the safety of minors and the presence of pedophiles on the social VR platform VRChat as the most pressing issues to be tackled on the platform and policy side:

“Yes, there are so many pedophiles in this game it is actually ridiculous. […] And they will kind of prowl upon anyone who has some sort of need or wanting for attention, regardless of age. And there are a lot of really social, angsty teenagers in this game aged like 13 to 17, […] some of these guys or girls are really like just sinking their teeth into because they can give them attention that they need in exchange for […] certain types of like child pornography and like abuse towards minors happening on the platform.”

Interviewee 3 suggests that crimes in VR should be prosecuted, and proposes security recordings of the virtual environments while users interact: “That’s if he’s up and online, that doesn’t mean he’s not a crime. […] There is need a video of the world or two hours of the world. So, like a security Cam basically.”

Interviewee 4 mentions the responsibility of parents to monitor their children’s media consumption and suggests that crimes involving drugs and the abuse of minors are the most urgent to address:

“[…] pedophiles being on the game and going to young children and being like, hey, well, your parents aren’t watching you playing this game […]. But there’ll be people in this meeting like, well, your mom or dad isn’t watching you. You have this headset on. I’ll be your new mom or dad. And I’ll take care of you, and I’ll buy you whatever you want. You just have to listen to me. And they will do that. […] And that’s how kids fall into their trap.”

The interviewees highlight issues such as harassment, avatar theft, and the presence of pedophiles on VRChat. They emphasize the need for legal accountability for criminal activities in VR and stress the importance of protecting minors and managing intellectual property rights within the platform.

What should policy makers, regulators or designers know about VR experiences like VRChat?

The final question of our study asked for recommendations and insights for policy makers, regulators, or designers who are interested in or responsible for social VR experiences like VRChat. We asked participants about their expectations and suggestions for improving the quality, safety, and accessibility of social VR platforms.

Interviewee 1 emphasizes the importance of the blocking feature, which they see as most helpful in preventing continued harassment: “[…] I use the avatar blocked by default just in case if somebody has an avatar that I really don’t want to see. Um. And I think if more VR applications implemented something like that as well, that would just be huge.” (I 1)

Interviewee 2 warns about the safety of minors on social VR platforms as well as the safety of adults mistakenly interacting with minors: “But there are a lot of pedophiles playing this game, sadly. And that I feel like needs to be prosecuted by the police. […] There should be some kind of punishment, but it’s next to impossible to actually prosecute people online.”

Interviewee 3 refers to the previously mentioned security recording as well as ID verification to enable prosecution of online crimes: “Because I think we’re definitely a lot of people are looking at is something like ID verification […] if the metaverse is going to be a real second kind of world […] It’s probably also a matter of like, does this person need to verify themselves, right?”

They further point out the importance of ID verification being free of charge, to ensure access and inclusion for financially weaker countries and individuals: “[…] Maybe there is someone who is really poor and they tried to escape from their daily life and got to have some fun. If you put another thing so this person has to pay, I think it’s not good. So, like a verification. A free verification.” (I 3)

Interviewee 4 points out the importance of policy makers and regulators having first-hand experience with social VR platforms as a basis for more informed decisions, as well as staying updated on the needs and experiences of long-term users: “[…] a lot of them, when they hear about VRChat, they don’t know anything about it. […] I just wish that they can at least have a little bit of you and put the headset on, so they know what’s going on in the world and understand it a bit more before making their final decisions.”

They further argue that the level of immersion heightens the urgency for regulations and safety on social VR platforms: “I feel like they when I’m putting on this headset, it isn’t just a game. At the end of the day, there’s people behind the headset […] who have actual feelings and thoughts.” (I 4) 

Interviewee 4 also warns that unaccompanied time in the headset is risky for children and should be monitored by adults:

“[…] I want more people to know that parents and just whatever adult is watching your child needs to have better like moderation over the headset. […] Because they’re the age limit for VR and there’s kids younger than that are entering this game […]. And I really blame the parents for that.”

Interviewees in the study recommend policy makers and platform designers enhance safety features in social VR, such as implementing effective vote kick systems, improving minor protection, and considering ID verification for user accountability. They emphasize the need for first-hand experience by regulators to better understand the platform’s dynamics and stress the importance of making security measures accessible to all users, regardless of their financial situation. Additionally, they highlight the necessity of parental supervision to safeguard children from inappropriate content and interactions.

Additional comments from Interviewees

In conclusion, Interviewee 1 returns to the unique possibilities of VR to personalize the user experience and enable social interaction for individuals who have difficulty socializing in real life: “[…] I am an introvert […] I started streaming around 2020 […] those things have really helped me socialize more. I think it’s usually because of the control aspect of it. […] And if force comes to absolute worse and I get too overwhelmed, I can just leave.”

Interviewee 2 also emphasizes their experience of practicing social interaction in a safe space: “Social interaction was my reason for getting in VR, yes. But I feel like I’ve grown enough as a person since starting the game, starting VRChat and to interact with real people as well without being too insecure about myself or being super angsty.”

Interviewee 4 argues that the age rating for social VR spaces should be higher, as the technology is not comparable to other games and the degree of immersion could be a safety risk for minors:

“Every game is so harmful to the mind and body or whatever. […] The only way you’re going to understand [this game] is by the voice of the people and by experiencing yourself how immersive such a VR environment and VR situations in VR can be, right? […] you can buy suits to feel touch in VR […] like you can touch people over the Internet. You really want to have like a more or less regulated metaverse where you can try these things in a safe context at least. It should at least be 18 plus, if not even 21.” (I 4)

The Interviewees highlight the unique benefits of VR for improving social interaction skills, especially for introverts, due to its controlled environment and customizable features. They also stress the need for higher age ratings for social VR spaces due to the heightened immersion and potential risks, suggesting that regulations should adapt to the advanced nature of VR technology.

Discussion of the results

The study is limited by its small sample size of only four participants. The sample is also restricted in terms of age groups, although male, female, and diverse gender identities are represented. Geographically, the participants come from the United States and Europe, specifically Missouri, Texas, Denmark, and Italy, which limits the generalizability of the findings.

Summary of key findings

Experiencing and expressing personalities unlinked from their material bodies serves as an opportunity to experiment with and learn about social interaction. Friendships, relationships, and small talk with strangers are embraced in VR for various reasons, such as users’ location or social anxiety.

Anonymity in online spaces has significant positive and negative impacts on people’s behavior. 

Mechanics of protection are still in development, often buggy, and not a main priority for platforms.

The most pressing area of improvement is the safety of minors through identity verification of users. 

Relevant concerns include limited knowledge from policy, law and decision makers of the various immersive aspects of social VR. Users emphasize the incomparable immersive nature of VR and the resulting social aspects like friendships, relationships and personal development of interacting in digital environments for large amounts of time. 

Opportunities and challenges of social VR

Opportunities: Socializing with likeminded people worldwide. Logistics and environmental impact are mostly independent from users’ location. Learning and collaboration are accelerated through high levels of immersion. Creative expression through interactive tools as well as relevant audiences are widely accessible.

Challenges: Fraud and harassment by incognito users are widespread and frequent in online social spaces. The immersive quality of VR makes harassment particularly dangerous. Anonymity and missing age verification enable age-inappropriate behavior and abuse of minors. 

Subsequent research questions

Socializing in VR compared to material reality (material as in our regular physical reality) and the effects on users of different age groups.

The interviews clearly showed that experiencing and expressing personalities unlinked from their material bodies in VR offers users the chance to experiment with and learn about social interaction. Many embrace friendships, relationships, and small talk with strangers due to factors such as their location or social anxiety. While anonymity in online spaces has significant positive and negative impacts on behavior, protective mechanics are still under development, often buggy, and not prioritized by platforms. The most pressing improvement needed is the safety of minors through identity verification. Policy, law, and decision makers often lack comprehensive knowledge of the immersive aspects of social VR, which users emphasize as having an unparalleled immersive nature that fosters friendships, relationships, and personal development through extensive interaction in digital environments.

Social VR presents opportunities such as socializing with like-minded people worldwide, logistical and environmental independence from users’ locations, accelerated learning and collaboration through high levels of immersion, and creative expression through interactive tools and accessible audiences. However, it also faces challenges including widespread fraud and harassment by incognito users, the particularly dangerous nature of harassment due to VR’s immersive quality, and the risks posed by anonymity and lack of age verification, which enable age-inappropriate behavior and abuse of minors. Subsequent research could explore socializing in VR compared to material reality and its effects on users across different age groups.

References

  • Banakou, D., Groten, R., & Slater, M. (2013). Illusory ownership of a virtual child body causes overestimation of object sizes and implicit attitude changes. Proceedings of the National Academy of Sciences, 110(31), 12846–12851. https://doi.org/10.1073/pnas.1306779110
  • Deighan, M. T., Ayobi, A., & O’Kane, A. A. (2023). Social virtual reality as a mental health tool: How people use VRChat to support social connectedness and wellbeing. Proceedings of the ACM CHI Conference on Human Factors in Computing Systems.
  • Hinduja, S., & Patchin, J. W. (2024). Metaverse risks and harms among US youth: Experiences, gender differences, and prevention and response measures. New Media & Society, 26(1), 1–22. https://doi.org/10.1177/14614448241284413
  • LaValle, S. M. (2020). Virtual reality. Cambridge University Press.
  • Quent, M. (2023). Demokratische Kultur und das nächste Internet: Chancen und Risiken virtueller immersiver Erfahrungsräume im Metaverse. In Institut für Demokratie und Zivilgesellschaft (Ed.), Wissen schafft Demokratie: Schwerpunkt Netzkulturen und Plattformpolitiken (Vol. 14, pp. 30–43). Jena, Germany.
  • Sabri, N., Chen, B., Teoh, A., Dow, S. P., Vaccaro, K., & ElSherief, M. (2023). Challenges of moderating social virtual reality. Proceedings of the ACM CHI Conference on Human Factors in Computing Systems.
  • Stockselius, C. (2023). Social interaction in virtual reality: Users’ experience of social interaction in the game VRChat (Master’s thesis, University of Skövde). University of Skövde.
  • Vogelgesang, A. (2024). Social VR Platform Design, User Creativity and Aesthetic Governance. Project Immersive Democracy, https://metaverse-forschung.de/en/2024/02/15/social-vr-platform-design-user-creativity-and-aesthetic-governance/.

Unpixelated Hate? Lessons from VR Gaming for a Digital Civil Society

Mick Prinz
Mick Prinz holds a B.A. in Social Sciences and heads the model project “Good Gaming – Well Played Democracy” at the Amadeu Antonio Foundation. Together with game developers, e-athletes, and influencers, he campaigns for a stronger digital civil society in gaming culture. His research focuses on toxic and right-wing extremist gaming communities, user-generated content on gaming platforms, and propaganda video games.

Video game worlds are immersive realities. Depending on the genre and type of game, a wide variety of narratives are explored, scenarios are opened up, and values are conveyed. Gaming is much more than an escapist immersion into a digital Wild West, medieval villages, or futuristic space battles. Within different gaming spaces, political debates are held, and stereotypes are either rejected or reproduced. In toxic niches of gaming worlds, anti-democratic actors attempt to apply metapolitical strategies and normalize racist, anti-Semitic, or misogynistic narratives. However, numerous movements advocating for a pluralistic understanding of values also coexist within virtual realities. The video game industry has been a testing ground for these dynamics since the 1980s. What experiences can be derived from gaming worlds that resemble metaverse structures? How do toxic and right-wing extremist actors attempt to instrumentalize immersive video game spaces, and what threats and potential for a digital civil society does virtual reality (VR) gaming harbor? 

INTRODUCTION

In Germany, over half of 6-69-year-olds play computer and video games.1https://www.game.de/wp-content/uploads/2023/08/230809GME_Jahresreport_2023_168x240_DE_Web.pdf (10.12.2023) While many game titles are increasingly breaking stereotypes, some video games incorporate anti-democratic narratives and promote discrimination in their plots and character design. It is not just the game settings that have political implications; the gaming communities themselves are also becoming increasingly political. Discussions on topics such as Black Lives Matter2https://www.gamestar.de/artikel/live-stream-mit-der-chefredaktion-blacklivesmatter-und-der-einfluss-auf-die-gaming-branche,3358410.html (10.12.2023), the invasion of Ukraine, or the situation in the Middle East are also taking place in in-game chats and on gaming platforms. These debates are often characterized by toxic arguments and discrimination against marginalized positions – a climate that anti-democratic actors exploit. Poorly moderated gaming spaces are used to spread misanthropic narratives, intimidate people, and pursue international networking. Players experience toxic agitation on gaming platforms such as Twitch, Steam, or Roblox, as well as in in-game spaces, voice or text chats, and virtual realities. The intensity of hate and discrimination varies within immersive gaming realities.

In this article’s first step, the nature of video game culture will be examined, and a definition of immersive gaming realities will be provided. Various examples will be outlined, and the cornerstones of VR gaming will be traced. The topic of dark participation and toxicity in video game contexts will then take center stage: different levels of discrimination will be outlined, and examples of reactionary agitation within different gaming communities will be discussed. The extreme right in existing immersive realities will also be a focus. Three forms of instrumentalization will be examined here: networking, mobilization, and the “metapolitics” of the extreme right in video game contexts. These examples emphasize the challenges that metaverse-like structures face and the tasks they will have to deal with in the future. The text then argues that immersive realities have great potential for promoting democratic values. Both titles created for entertainment purposes and serious games (games with an educational focus) are discussed here, as these types of video games can effectively convey values through virtual realities. The article concludes with a summary of the lessons and challenges for designing metaverse-like structures based on gaming culture.

WHAT DO WE MEAN BY IMMERSIVE REALITIES IN VIDEO GAMES?


The term “immersion” initially refers to being fully immersed in a specific environment or setting (Murray, 1998). Digital technologies such as virtual reality, augmented reality (AR), mixed reality (MR), and the entire ecosystem of digital games are all categorized as immersive realities.3Mühlhoff & Schütz, 2019; Nilsson et al., 2016 Video games, in particular, often contain elements of immersive realities. They have a virtual environment that is partially disconnected from the physical world but can still be based on reality. A prime example of this is the popular game Minecraft, which had over 180 million monthly users in 2023.4https://www.g-portal.com/wiki/de/wie-viele-leute-spielen-minecraft/ (10.12.2023) Players can create and design entire worlds, establish ecosystems, and work towards cooperative or competitive goals. Along with countless fictional worlds, Minecraft also has servers where players can recreate real German cities, while elsewhere, digital depictions of Nobel Peace Prize winners educate players on basic democratic values.5Diedrich, Gamestar.de, 2022 https://www.gamestar.de/artikel/minecraft-nobel-dalai-lama,3378349.html (10.12.2023) Additionally, the project “Reporters Without Borders” created the “Uncensored Library” in a digital Minecraft environment to showcase works that are banned in autocratic regimes. Thanks to the free accessibility of Minecraft’s immersive reality, censored literature remains accessible even in dictatorships.6https://www.uncensoredlibrary.com/de (10.12.2023)


Figure 1: Screenshot of the Saudi Arabia section in the “Uncensored Library”, source: Reporters Without Borders, 2023


Another important aspect of immersive realities can be seen in video games: avatarization. Players often have the opportunity to take on the role of a fictional character, create their own storyline, or follow a pre-defined narrative. Designing a unique character is also a central component, particularly in (online) role-playing games. Many modern titles offer a wide range of options for customizing one’s own digital likeness, closely matching real-life appearance if desired. Titles such as the space role-playing game “Starfield” allow for a free choice of pronouns, while other examples such as the fantasy role-playing game “Baldur’s Gate 3” let players choose their voice, gender attribution, and external characteristics independently. While many players welcome these diversity options, which allow them to authentically represent their own identities in a game, other gamers react with reactionary rejection and toxic criticism.7Seng, FAZ, 2023, https://www.faz.net/aktuell/feuilleton/medien/pronomen-in-games-wokeness-im-weltall-19288095.html (10.12.2023) 

It is also noticeable that many online video games enable multi-layered interaction options with other entities. One example of this is the escapist simulation game “Animal Crossing: New Horizons”. A game mechanic here makes it possible to visit the islands of other players. In addition to social interaction opportunities, a visit can also be economically motivated. Different islands mean potentially different turnip prices – a commodity whose prices vary daily and differ between players’ islands. The focus here is less on social motives and more on the capitalist maximization of one’s own money within the game cosmos.8Zeitz, Gamestar, 2020, https://www.gamepro.de/artikel/animal-crossing-new-horizons-ruebenpreise-vorhersagen,3357127.html (10.12.2023) In addition to these obvious forms of interaction dictated by the game, Animal Crossing has also attracted attention through political protest. Players join forces and demonstrate in favor of a free Hong Kong, the anti-racist Black Lives Matter movement, or against toxic corporate structures in gaming companies.9Mack, ze.tt, 2020, https://www.zeit.de/zett/politik/2020-04/china-verbannt-animal-crossing-nach-virtuellen-hongkong-protesten (10.12.2023) 


Figure 2: Black Lives Matter protest in The Sims 4, source: BlackSimsMatter, 2020


Digital protests with references to real-life conflicts take place in immersive realities in many places, while other titles focus on co-operative or competitive scenarios. Potential characteristics of an immersive gaming reality therefore include the creation of separate ecosystems, the ability to live through fictional or realistic scenarios with an avatar, shaping stories, and interacting with other players. 

GAMING AND THE CHANGING RELATIONSHIP WITH VIRTUAL REALITY

Debates about the use of VR technology are nothing new. Decades before Mark Zuckerberg and Meta publicized the idea of a metaverse-like, digital parallel world, the first attempts at VR were already being made in the game industry. In the 1980s, projects tried their hand at early VR versions, but VR arcade machines suitable for the masses were not created until 1991 by the company “W-Industries”.10Redaktion TechMonitor, 1991, https://techmonitor.ai/technology/w_industries_makes_virtual_reality_a_reality_at_ukp20000 (10.12.2023) Based on the Amiga 3000, W-Industries presented its 1000 series arcade machines. Two years later, the SEGA VR was announced, which for the first time resembled today’s VR headsets.11Bezmalinovic, Mixed, 2017, https://mixed.de/sega-vr-vom-aufstieg-und-fall-der-legendaeren-vr-brillle/ (10.12.2023) However, due to immense production costs caused by expensive motion sensors and the then still unsolved problem of motion sickness, SEGA’s VR goggles were never released. In the years that followed, there were repeated attempts to use VR-like technology in the gaming world, such as Nintendo’s “Virtual Boy” in 1995.12Norman, History of Information, 2013, https://www.historyofinformation.com/detail.php?id=3637 (10.12.2023) Ultimately, however, insufficient processors, low frame rates, high latency, high production costs, and extensive reports of eye strain and headaches put a stop to these first steps in VR gaming.

It was not until 2012 that the newly founded company “Oculus VR” once again attempted to bring VR to the gaming market. Its VR goggles, the “Oculus Rift”, were financed via crowdfunding.13Harris, 2019, The History of the Future: Oculus, Facebook, and the Revolution That Swept Virtual Reality Facebook (now Meta) recognized the potential of the VR market and bought Oculus VR in 2014. Many other companies, such as HTC and Microsoft, signalled their interest and tried their hand at their own VR hardware. Once again, the virtual reality boom failed to materialize, with VR headsets from various manufacturers selling too poorly. This was mainly due to the lack of major game titles that would have justified a purchase for gamers. It was not until the “Meta Quest” and “Meta Quest 2” headsets and games such as the rhythm lightsaber game “Beat Saber” (2018) and “Half-Life: Alyx” (2020) that VR technology achieved higher sales figures and the interest of many video gamers. However, there is currently disagreement in the gaming community. With the PlayStation VR2, Sony is demonstrating how powerful current video game hardware is, and high-resolution titles such as “Horizon Call of the Mountain” are impressing the trade press and gaming public. However, there is still no broad catalogue of games for the VR headsets from Sony, Meta, or Apple that would anchor the technology more firmly among gamers. As of 2023, VR remains a niche in gaming culture that is noticed by the broad mass of gamers but far from widely adopted.

DARK PARTICIPATION, TOXICITY AND GAMING

Despite the current skepticism towards VR hardware, immersive realities and virtual reality have long been an integral part of gaming culture. Video gamers are usually part of various gaming communities that form around different genres, influencers, or game titles. The majority of gamers have already experienced hate in gaming spaces – whether in in-game chats, on platforms, or through private messages. According to the “Hate is no Game” study by the Anti-Defamation League, 77% of over-18s have experienced hate speech in video game contexts. Alarmingly, 20% of adult gamers have even encountered right-wing extremists within their communities. Female gamers, non-white gamers, Jewish gamers, and parts of the LGBTQIA+ community are primarily affected by hate in gaming; they are defamed simply for belonging to these groups.14Kelley, ADL, 2022, https://www.adl.org/resources/report/hate-no-game-hate-and-harassment-online-games-2022 (10.12.2023) 

“Dark participation” or toxic behaviour primarily excludes marginalized groups from various fields of interaction in immersive realities. The term “dark participation” encompasses insults, hate speech, sexual harassment, discriminatory profiles/nicknames, doxxing, and the spread of disinformation. In gaming practice, there are various levels at which dark participation becomes visible. Firstly, there is the “in-game” level. Whether via text message or voice chat, almost every online video game has to deal with insulting, patronizing, and even misanthropic statements within player communication. Studios and game makers try to reduce hate speech with moderation tools and word filters, but toxic gamers adapt to these systems and circumvent restrictions, for example with leetspeak (i.e. mixing letters and similar-looking numbers). For example, many e-athletes and gamers report on the toxic everyday climate in the video game “League of Legends”15Gogoll, Taz, 2021, https://taz.de/Trainer-ueber-Diskriminierung-im-E-Sport/!5812767/ (10.12.2023), and female gamers report sexism in the voice chat of the shooter “Valorant”.16esports.com, 2021, https://www.esports.com/de/streamerinnen-rant-auf-twitter-so-gross-ist-das-sexismus-problem-in-valorant-242179 (10.12.2023) VR video games, in particular, harbor the risk of ableist, racist, or misogynistic discrimination reaching those affected unfiltered. There are examples from “Among Us VR” and VR sports games in which the N-word is uttered by an avatar in the user’s immediate vicinity. It is not just gaming experiences but also misanthropy that is made more immersive by VR headsets.17Youtube Channel Mayomonkey, Youtube, 2022, https://www.youtube.com/watch?v=qbuAtWkr46s (CN Racism) (10.12.2023)
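
The cat-and-mouse game around word filters described above can be illustrated with a small sketch. The character mapping and the placeholder blocklist term below are assumptions for demonstration only, not any platform's actual moderation logic:

```python
# Illustrative sketch: a naive word filter extended to catch simple
# leetspeak substitutions (letters swapped for look-alike digits/symbols).
# LEET_MAP and BLOCKLIST are hypothetical examples, not a real product's data.

LEET_MAP = str.maketrans({
    "0": "o", "1": "i", "3": "e", "4": "a",
    "5": "s", "7": "t", "8": "b", "@": "a", "$": "s",
})

BLOCKLIST = {"hate"}  # placeholder term standing in for a real slur list

def normalize(text: str) -> str:
    """Lower-case the text and map common look-alike characters back to letters."""
    return text.lower().translate(LEET_MAP)

def is_blocked(message: str) -> bool:
    """Return True if any normalized token matches the blocklist."""
    return any(token in BLOCKLIST for token in normalize(message).split())

# normalize("H4t3") yields "hate", so the disguised variant is caught.
```

A filter like this catches only the simplest substitutions; real moderation systems combine such normalization with context-aware classifiers and human review, and determined users keep inventing new variants, which is why the text speaks of adaptation and circumvention.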


Figure 3: Toxic user review of the Horizon Forbidden West expansion on Metacritic. The narrative: Children are “homosexualized” by the game story (source: Metacritic, 2023)


Another level at which dark participation in gaming takes place is the platform level. Toxic communication is also noticeable on various gaming platforms outside of the actual game – for example, when Black Lives Matter (BLM) is discussed on Steam, a platform popular with PC gamers. In addition to solidarity with George Floyd and the BLM movement, gamers reproduce racism and spread disinformation (Amadeu Antonio Foundation, 2022).18Prinz, Amadeu Antonio Stiftung – Unverpixelter Hass – Gaming zwischen Massenphänomen und rechtsextremen Radikalisierungsraum, 2022, https://www.amadeu-antonio-stiftung.de/neue-handreichung-unverpixelter-hass-gaming-zwischen-massenphaenomen-und-rechtsextremen-radikalisierungsraum-81173/ (10.12.2023) Elsewhere, toxic gamers postulate that diversity in video games warrants a boycott and that political stances must be kept out of such works.

The comments on the game “Starfield,” where players can freely choose pronouns for their avatars, are characterized by such subjective criticism.19Steam, 2023 https://store.steampowered.com/agecheck/app/1716740/ (10.12.2023) Other interactive platforms like Twitch are also not safe spaces for marginalized groups. Apart from discrimination in livestream chats, there are countless talk shows where far-right opinions are given a platform.20Belltower.news – Redaktion, 2021, https://www.belltower.news/gaming-rechtextremismus-im-livestream-was-passiert-auf-twitch-111417/ (10.12.2023)

IMMERSIVE REALITIES AND THE FAR RIGHT

Dark participation is gaining ground in various areas of the gaming world with little resistance. This is largely due to players ignoring problematic content and reporting functions being underutilized. In addition, platform operators are inconsistent and lacking in their moderation efforts. As a result, extreme right-wing actors are also active in immersive realities, alongside toxic voices. There are three ways in which the extreme right utilizes immersive realities.

The first is networking (1) within digital worlds. In the online mode of the popular action game “Grand Theft Auto V,” users can create their own avatars. On their own role-playing servers, players establish their own objectives and follow a daily routine. Within the digital world, players who follow the SHAEF conspiracy narrative also gather. In GTA Online, they even go on digital patrol. SHAEF refers to the “Supreme Headquarters Allied Expeditionary Force,” a term used by supporters of this conspiracy theory claiming that Germany is an occupied state, a “BRD GmbH” (Belltower.news, 2021).21De:hate, Belltower.news, 2021, https://www.belltower.news/shaef-verschwoerung-commander-jansen-verbreitet-trumps-willen-auf-telegram-123213/ (10.12.2023) The platform Roblox allows players to create their own mini-games/”experiences” and offer them for free download, but it also features numerous right-wing extremist worlds and experiences. On one map, players can assume the role of an SS officer and march through a fictionalized 1940s Berlin, greeting others with the Hitler salute. In other areas, players navigate through an extermination camp where the Shoah is trivialized. Digital avatars can also participate in target practice on swastika emblems. Known for its popularity among 9-12-year-olds, Roblox has over 200 million active users.22Prinz, taz, 2023, https://taz.de/!5909022/ (10.12.2023) Many of the far-right “experiences” can be found with just a few clicks (Prinz, 2023). Another title on Steam, however, is more subtle. The description states that “[game name] is a truly immersive and uncensored social experience where you can engage with like-minded people in a mutually constructive way.” At first glance, this may sound promising. However, the game allows for the expression of any political opinion in a digital environment. It incorporates symbols from the Canadian Covid-19 protest movement and the “QANON” conspiracy ideology (Steam, 2023). 
This use of immersive realities as alternative spaces for interaction is a tactic employed by extreme right-wing actors to build their own world of experience with like-minded individuals, without encountering resistance.

This is also connected to another motivation: gaming and immersive realities are seen as part of a cultural struggle for the right, known as “metapolitics” (2). The “new right” considers “metapolitics” as a strategy for shifting social discourse to the right by occupying pre-political spaces. In practice, this can involve the dissemination of right-wing extremist symbols and codes or the glorification of extreme right-wing actors within immersive gaming realities and on video game platforms. For instance, there are numerous profiles on Steam that use the far-right slogan “Read Siege”, inspired by the instructions for “race war” and “leaderless resistance” followed by the right-wing terrorist group “Atomwaffen Division” (Ayyadi, 2020).23Ayyadi, Belltower.news, 2020, https://www.belltower.news/siege-james-masons-anleitung-zum-rassenkrieg-fuer-die-atomwaffen-division-94551/ (10.12.2023) The perpetrators of Christchurch, Halle and Buffalo were also praised on Steam through their profile pictures and descriptions (Gensing, 2020).24Gensing, Tagesschau, 2020, https://www.tagesschau.de/investigativ/steam-christchurch-terrorismus-101.html (10.12.2023) Roblox even had its own video game worlds where players could assume the role of right-wing terrorists and re-enact these attacks. For example, one downloadable experience allows players to play as the right-wing extremist from Buffalo and replay the attack on the “Tops” supermarket.


Figure 4: Right-wing extremist “experience” on Roblox. Here, players can re-enact the Buffalo attack (source: Roblox, 2023)


Re-enactments of the attacks in Christchurch and Halle were available on the platform until the moderators reacted months later.25Graf/Marx, Gamestar, 2022, https://www.gamestar.de/artikel/podcast-gaming-rechtsextremismus,3387301.html (10.12.2023) The far right primarily uses gaming areas where they can create and design their own content. Sandbox-like game concepts such as “Roblox” and “Minecraft” are used here, as well as the workshop area of Steam, where modifications (mods) for existing games can be developed. This includes mods in which players can play as the Waffen SS as a regular faction in “Company of Heroes”, or select Adolf Hitler as head of state in “Civilization VI”.26Steam, 2023, https://store.steampowered.com/app/289070/Sid_Meiers_Civilization_VI/?l=german (10.12.2023) The far right thus primarily utilizes existing gaming infrastructures to push its “metapolitics”. Occasionally, neo-Nazis also try to develop their own propaganda games, such as a right-wing developer studio belonging to the “Identitarian Movement”. In its game, players can choose a cadre figure from the so-called “new right” and fight against enemies of the right as Martin Sellner or Alexander “Malenki” Kleine in a dystopian 2D pixel platformer.27Jugendschutz.net, 2020, https://www.jugendschutz.net/mediathek/artikel/praxisinfo-computerspiel-heimat-defender-rebellion (10.12.2023) Developing independent VR games does not seem to be a priority for right-wing extremist groups.

The final motive for the extreme right to use immersive gaming spaces is mobilization (3). Extreme right-wing actors use gaming evenings or “patriotic” video game tournaments to appeal to potentially interested parties and reactivate their own in-group. Roblox’s immersive realities have been used to target young people: young players were invited to join right-wing extremist groups and motivated to participate in joint shooting exercises (SWR Vollbild, 2023).28Youtube Channel Vollbild, Youtube, 2023, https://www.youtube.com/watch?v=-01FPD-SYsc (10.12.2023) Elsewhere, merchandise from conspiracy-ideology influencers is distributed in the game “Second Life”, where player avatars act as digital billboards for disinformation. Right-wing extremist groups also appear in existing VR games like “VRChat” to spread inhumane narratives, to provoke, but also to arouse interest. For example, players in Ku Klux Klan outfits have appeared in VRChat and given racist monologues.29Youtube Channel ScaryToaster, Youtube, 2022, https://www.youtube.com/watch?v=j6Oxzjt2oxg (CN Racism) (10.12.2023)

Figure 5: Screenshot from VRChat. A person in a KKK outfit harassed other players, source: VRChat, 2022


It is clear that the far right uses various areas of immersive realities to spread racist and anti-Semitic ideological elements, (re)mobilize people, and network at the same time. The ability to create their own avatars, implement gestures, and build their own immersive realities (such as in the Roblox sandbox or Minecraft) harbors potential dangers. As a result, parts of the digital world are becoming places where those affected by discrimination, in particular, cannot move freely and safely. 

DEMOCRACY IN VR AND IMMERSIVE REALITIES?

Gaming culture highlights the challenges faced by immersive realities and metaverse-like spaces. At the same time, experiences from various video game communities emphasize that digital worlds harbor democratic potential. More and more video games convey a canon of values that emphasizes plurality and tolerance. This also applies to video games that primarily serve entertainment purposes and have a commercial interest. For example, “The Last of Us Part II” implements a queer love triangle and deals with experiences of discrimination against trans people. “Spider-Man 2”, on the other hand, depicts anti-racist and diverse narratives in its setting, while modern role-playing games such as “Baldur’s Gate 3” break with binary gender stereotypes in the character editor.30Hart, Gayming, 2022, https://gaymingmag.com/2022/12/baldurs-gate-3-adds-non-binary-option/ (10.12.2023) Elsewhere, the action game “Assassin’s Creed Mirage” provides insights into Islamic beliefs. So-called “serious games”, i.e. video games with educational claims, are also significantly more immersive than the static educational games of the early 2000s. “The Darkest Files”, for example, is a historical investigation and courtroom game in which players join Fritz Bauer’s team of public prosecutors and have to bring Nazi perpetrators to justice.31Steam, 2023, https://store.steampowered.com/app/2058730/The_Darkest_Files/ (10.12.2023) The game “Hidden Codes” by the Anne Frank Educational Centre, on the other hand, provides information about current right-wing extremist symbolism. Thanks to a classroom version, the title can be used in school lessons without much effort.32Bildungsstätte Anne Frank, 2023, https://www.hidden-codes.de/ (10.12.2023) Democratic values can thus be played and experienced.

Democratic movements are also articulated within immersive realities. For example, hundreds of players come together every year in the world of Tyria in the online role-playing game Guild Wars 2 to organize a protest march for the LGBTQIA+ community. In addition to its symbolic effect, the march also collects donations for initiatives that take a stand against misanthropy (Guild Wars 2 Forum, 2023).33https://en-forum.guildwars2.com/topic/129869-tyria-pride-2023/ (10.12.2023) Elsewhere, players in the worlds of “The Sims” and “Animal Crossing” show their solidarity with the Black Lives Matter movement by designing cosmetic clothing with BLM lettering and highlighting everyday racism and discrimination in digital demonstrations.34Grayson, Kotaku, 2020, https://kotaku.com/sims-players-hold-virtual-black-lives-matter-rally-1843998733 (10.12.2023) In response to the Russian invasion of Ukraine, players in the world of “The Elder Scrolls Online” took a stand and organized a march for peace for those affected in Ukraine.35Marasigan, MMOs.com, 2022, https://mmos.com/news/elder-scrolls-online-protest-russias-invasion-of-ukraine-with-peace-marches-in-cyrodiil (10.12.2023)  Democratic protests are no longer a rarity, especially in online video games where gamers create their own avatars and organize themselves into player communities.


Figure 6: Pride parade in the game Guild Wars 2; once a year, players organize a solidarity demonstration for the LGBTQIA+ community, source: Guild Wars 2, 2022


Within VR worlds, on the other hand, there are only a few pilot projects that promote greater awareness of discrimination. On the one hand, this is due to the limited distribution of VR headsets compared to other gaming hardware. On the other hand, it is also due to the educational and didactic challenges that immersive experiences in VR worlds present. Due to the immersion, the depiction of discriminatory and hateful statements in virtual realities can be traumatic, especially for those affected. The “Debug” project therefore accompanies users in their VR experience away from the headset. In the game, players are followed by a swarm of mosquitoes that continuously utter racist stigmatizations. Debug thus conveys the omnipresence of everyday racism for those affected and the intensity of stigmatization (Belltower, 2022).36https://www.belltower.news/videospiel-gegen-rassismus-virtual-reality-ermoeglicht-einen-perspektivenwechsel-141205/ (10.12.2023) In the title “1,000 Cut Journey”, on the other hand, players experience different life stages and perspectives of a Black protagonist. Whether through disciplinary measures in the classroom, police encounters as a teenager, or workplace discrimination as a young adult – racism runs through the character’s life (Gamesforchange, 2023).37Prinz, Belltower.news. 2022, https://www.gamesforchange.org/games/1000-cut-journey/ (10.12.2023) 


Figure 7: Screenshot of the VR game “I AM A MAN”, source: Meta, 2023 


Another VR experience entitled “I Am A Man” makes the history of the African American civil rights movement tangible. The VR presentation aims to provide a personal understanding of the struggles of marginalized groups. The title uses historical film footage, photographs, and voice recordings of actual participants in the civil rights movement. “I Am a Man” sees itself as an interactive documentary experience intended to provide a deeper awareness of anti-racist struggles (Meta, 2023).38 I Am A MAN, Meta, 2018, https://www.meta.com/de-de/experiences/pcvr/1558748774146820/ (10.12.2023)

Virtual realities cannot fully depict the suffering and pain of those affected. However, they attempt to make the perspectives of those affected accessible to people who, from privileged positions, have not yet had to deal with discrimination or simply did not want to. Through the integration of historical sources, they can also generate a sense of closeness to historical events and make them more tangible.

CONCLUSION AND APPEAL FOR AN IMMERSIVE DEMOCRATIC CIVIL SOCIETY

The experiences of immersive realities and VR experiences in the context of gaming culture highlight opportunities, possibilities, and challenges that future metaverse-like ecosystems will face. Extreme right-wing actors utilize various segments of immersive reality, while democratic movements also make use of these spaces. In conclusion, clear recommendations for participation in future virtual realities are formulated.

  • 1) Avatarisation: If VR spaces allow users to create and design their own avatar, diversity options must be considered. An editor that only thinks in terms of binary gender models or ignores skin or hair types excludes broad sections of society. It is also important to ensure that discriminatory and/or criminally relevant symbolism cannot be reproduced.
  • 2) Moderation of inhuman content: In addition to comprehensive reporting and notification functions, proactive moderation is necessary to identify hateful symbolism and recognize anti-democratic narratives. It is also important to make it more difficult for extremist groups to establish their own VR spaces. Anti-hate-speech programs are already being tested in video game spaces, and criminal content should ideally be recognized at an early stage.
  • 3) Promote democratic participation: Immersive realities can be used to draw attention to problematic aspects in the physical world and to show solidarity with those affected. Ideally, VR spaces should provide an infrastructure that not only enables but also promotes democratic participation. For example, through moderated discussion forums, lectures, or workshops in immersive realities. Positive examples such as the “Uncensored Library” (Minecraft) should be made accessible to users.
  • 4) Protect marginalized groups/security settings: The so-called “metapolitics” and mobilization attempts in games such as Roblox or VRChat make it clear that extreme right-wing actors use immersive realities strategically. People affected by discrimination in particular, but also all other users, must be able to configure comprehensive security settings and be informed about them. It should be possible to mute text and voice chats, hide gestures, and activate a security area around one’s own avatar. This is essential to ensure that (virtual) physical proximity occurs only when desired.

To ensure democratic participation and not exclude marginalized positions, these four points are a minimum requirement for immersive realities.

The article emphasized the potential of VR spaces and digital worlds for a progressive understanding of values. At the same time, there are many pitfalls and anti-democratic endeavours that need to be considered in the potential technology of tomorrow. The problem becomes particularly clear with increasingly realistic graphics in immersive realities: if re-enactments of right-wing terrorist attacks, such as those in Christchurch or Halle, switch from blocky Roblox graphics to a hyper-realistic guise, the pull effect for extremist actors will increase. Developers, companies, authorities, and digital civil society must all play their part in a democratic metaverse. The lessons learned from game culture should not be forgotten.

Social VR Platform Design, User Creativity and Aesthetic Governance

 Arne Vogelgesang

is a performing artist. With the label internil and under his own name, he realises art projects that combine documentary material, new media, fiction and performance. One focus of his work is political propaganda on the internet, while an aesthetic focus is the portrayal of people under digital conditions. He is currently a doctoral candidate for artistic research at the University of Music and Performing Arts Vienna with a VR project on the body, desire and technology. 

INTRODUCTION

While the term “metaverse” is often employed to gesture towards a larger paradigm of digitized sociality, immersive applications of virtual reality technology (VR)1In this text, the term “VR” denotes technologically mediated immersive digital 3D environments, while the word “virtual” may in a wider sense also refer to other non-physical/online spaces, communities, practices or phenomena., commonly represented as humanoid figures communing in partly or wholly 3D-generated landscapes, are central to its vision. Social VR platforms currently realize this vision most fully, both socially and aesthetically, and therefore allow for the most immediate evaluation of current practices and possible developments of digital embodiment as a basis of “metaversed” online cultures. The following text is not a thorough empirical investigation of existing social VR platform culture, but an exemplary sketch of the landscape that tries to delineate the conditions for and possible effects of aesthetic governance in VR.

Social VR platforms are “immersive systems that prioritize and focus on the in-environment communication” (Liu & Steed, 2021). In earlier decades, such systems were discussed as “collaborative virtual environments” (CVEs), but the arrival of mass consumer VR hardware has shifted the terminology (Jonas et al., 2019). Social VR as a practice can be described as embodied social roleplaying within a system of connected and confined virtual 3D spaces2The terms “space”, “room”, “world” or sometimes “map” are often used interchangeably when talking about places inside social VR. I follow that mode of usage and reserve the term “platform” for the whole system of spaces, the infrastructure of which is most often run and owned by one company. I leave open-source self-hosted systems like Mozilla Hubs aside because they have so far not generated the community effects I am interested in. inhabited by avatars3For a closer look at avatars within social VR see Kolesnichenko et al. (2019). tethered to human users. Due to technical limitations, a single room on a social VR platform can currently host no more than about 50 people at the same time, which structurally encourages the dynamic creation and dissolution of social groups as well as their localization. Virtual rooms can be instanced multiple times in different social states: public, private, open only to friends or to people in possession of a link or token. Since digital assets can be copied, uniqueness in virtual environments is a scarce good and consequently has – like presence – become decoupled from (virtual) materiality to exist mainly as a transient psycho-social fact: as an experience.

Most, though not all, social VR platforms focus on meeting and connecting with strangers and have thus implemented functions to build user networks, like friends or group lists. Communication between users happens mostly verbally via microphone and through the expressiveness of avatar bodies via live VR body tracking or prerecorded movements, though other established media of social online communication like emojis and written chat are common as well. Almost all social VR platforms allow usage without a dedicated head-mounted display (HMD) in order to lower entrance barriers and enable user growth – in fact, the majority of people using the bigger social VR platforms are currently non-VR users, because VR hardware is still relatively pricey and of limited everyday utility.

Since the development and deployment of the Oculus Rift HMD around 2013 started the consumer VR mainstreaming phase, numerous social worlds and platforms supporting and/or centering on the technology have been created. Of the more than 160 entries on weblogger Ryan Schultz’s list of VR-capable social virtual worlds, however, only a few have garnered user counts of four or more digits (Schultz, 2023). The two most prominent ones as of 2023 are Rec Room and VRChat. A comparison of these two protagonists can yield an understanding of how different concepts of aesthetic worldbuilding and user creation may influence community development in terms of culture and politics – which is crucial when thinking about what “immersive democracy” might mean or come to be.

The two following chapters will therefore give an overview of the genesis and characteristics of these two platforms. This overview is then followed by a rough description of the communities that have formed on each platform in recent years. The concluding chapter will discuss aesthetic governance as a process developing between design paradigms and community culture(s). The author largely relies on his own observations during unstructured preliminary field research on various social VR platforms in the years 2020-2022. In this research, significantly more time was spent on VRChat than on other platforms, which leads to an imbalance of experience that will become apparent later in the text. Full-fledged ethnographic research through participatory observation on social VR is still lacking, but some literature using qualitative methods like interviews (Freeman et al., 2020; McVeigh-Schultz et al., 2019; Shriram & Schwartz, 2017), guided group walkthroughs (Liu & Steed, 2021) or social media discourse analysis (Zheng et al., 2022) has been taken into account, as well as primary and secondary online sources.

Rec Room

HISTORY AND AVAILABILITY

In spring 2016, a group of six men – some of them former Microsoft employees who had worked on the development of the mixed reality device HoloLens – founded the company Against Gravity to release Rec Room. The application was marketed as a “virtual reality social club where you play active games against competitors from all around the world”4Cited from the original press release accompanying the application’s launch, archived under https://web.archive.org/web/20160620140618/http://www.againstgrav.com/press. and featured a number of different virtual spaces for users to play and socialize in. Most games were and are competitive in nature, and over the years simulations of typical physical sports games like dodgeball were surpassed in popularity by more martial ones like laser tag that embed the common “first-person shooter” experience of online gaming into the social sandbox. Rec Room’s name refers to its central social metaphor, which is also the source of its unified aesthetics: “a prototypical rec center from the year 1987” (McVeigh-Schultz et al., 2019).

Rec Room was initially released to meet the market entry of the new HTC Vive HMD, while also being available for the Oculus Rift, and soon expanded to PlayStation 4’s VR system in late 2016. Since then, the software has become accessible for a fairly large number5“Fairly large” should be understood in comparison to other social VR apps. While technically, browser-based platforms like Mozilla Hubs would be accessible from any device with a compatible browser and thus have the lowest threshold for entry and widest possible adoption, in practice companies controlling access to VR applications via their stores have been reluctant to include and sometimes actively excluded WebXR-compatible browsers in order to limit non-proprietary platforms. of devices and operating systems: Windows PC desktops (either as a downloadable standalone application or via the digital distribution platform Steam), SteamVR-compatible as well as Oculus Rift and (Meta) Quest HMDs6Support for Quest 1 devices was discontinued in the first half of 2023 when Meta deprecated the relevant SDK., mobile iOS and Android devices, Xbox and PlayStation. Linux and macOS desktop devices are not supported.

ECONOMY AND ADOPTION

Likely due to its founders already being well connected in the industry, Against Gravity started off with seed funding from the multi-billion-dollar venture capital firm Sequoia Capital in 2016 and was able to raise investments of almost $300 million until late 2021 – the bulk of which poured into the firm during the Covid-19 pandemic7Numbers cited from https://www.crunchbase.com/organization/against-gravity/company_financials [accessed 2023, December 5].. The company has since changed its name to Rec Room Inc. Like virtually all social VR platforms at the moment (and the bulk of social media platforms more generally), Rec Room is free to use, with a few advanced features only accessible to paying customers. An in-game economy with tokens to spend on items and clothing was included from the start, and custom creations made by users with the platform tools can be traded for those tokens on the platform. In 2020, the ability to purchase tokens with “real” money at an exchange rate set by the company was added, as well as a monthly subscription feature called “Rec Room Plus” that allows creators of in-game assets to cash out their earnings once they have reached a threshold of 250.000 tokens (currently converting to $100). On the virtualization side of economics, room creators can also create their own sub-currencies, which may then be traded against tokens8See https://recroom.com/roomcurrencies [accessed 2023, December 5].. The company calls this whole meta-economy “Community Commerce” – a term for digital social commerce that has been gaining popularity in recent years, especially with TikTok’s growing success – and promotes it to its users as a potential way of “making a sustainable income”9Cited from https://blog.recroom.com/posts/2021/10/12/community-commerce-report..

On the platform’s website, Rec Room boasts more than 60 million users in 2022. While an impressive figure, this number of existing accounts is unlikely to reflect the number of real people actually using the platform, since it presumably includes abandoned, duplicate and otherwise inactive accounts. Occasionally, the company publishes its monthly active user count (MAU) at peak times to demonstrate its growth. In 2022, this peak was a reported 3 million accounts having logged into the platform over the course of a month (Au, 2022b). Meant to demonstrate growing adoption, this number still does not say much about the amount of time people spend on the platform and what they actually do there. Independent numbers are not available and would be hard to obtain from the outside, because users spread over thousands of single rooms.

AESTHETIC CONCEPT

Rec Room’s visual concept is one of virtual youth nostalgia – not only with regard to the choice of its metaphorical location, but also in the sense that its founders are too young to have memories of their own of a US college or university recreation center in 1987. The virtual spaces provided by the platform itself, called “Rec Room Originals”, are dominated by warm colors and rounded shapes creating a family-friendly10For these and the following descriptions compare McVeigh-Schultz et al. (2019), who have interviewed Rec Room designers about their decisions., nostalgic vibe. Simple materials and low-poly113D objects are usually built out of simple polygons. The number of polygons an object consists of limits its geometric complexity and correlates with the computational power needed for its visual rendering. Since technological development in graphics computation power is accompanied by a drive for higher-fidelity 3D realism, simpler “low poly” aesthetics have themselves become associated with a nostalgic look. 3D objects ensure fluid rendering and interoperability across different devices and operating systems and also add to the overall retro aesthetics12YouTuber Retr0‘s video “The Evolution of Rec Room (Release, 2016 and 2017)” gives an impression of the aesthetic development, but also its consistency over the years (Retr0, 2021)..

Fluid playability on mobile devices is also a major reason for the stylized humanoid user avatars on the platform not featuring any legs13Since consumer VR hardware commonly only provides movement tracking of three points – head and hands –, leg movement and positioning usually has to be inferred computationally. The company describes the rationale of the original avatar design in a blog post as follows: “We avoided showing untracked legs and arms because it could break the feeling of presence; we kept facial features cute and minimal to avoid the uncanny valley effect; and we chose simplicity over visual detail so the game ran smoothly” (https://blog.recroom.com/posts/avatars).. Platform users are represented through torsos floating above ground, with aligned but unconnected hands and head. These avatars can be customized individually inside the app with regard to their facial features, hairstyle, skin color, gender attributes, clothing and accessories. Stylized mouths with animations synchronized to the user’s microphone input make social interactions feel more “alive” and have been designed to predominantly convey a friendly expression. This design decision is a form of aesthetic nudging towards a more “positive” social atmosphere where, as one Rec Room developer put it, “everyone looks happy all the time” (McVeigh-Schultz et al., 2019).

Besides the “Rec Room Original” spaces/games developed by the company itself, users can build their own rooms out of an assortment of basic 3D elements and materials, as well as design custom avatar “costumes” and thereby body shapes. This is done with an in-game tool called “Maker Pen” – a stylized hot-glue pistol – and a visual scripting system called “Circuits” for interactive functionalities like buttons, dynamic architecture, collision detection or scoring systems. In 2023, an additional development kit called “Rec Room Studio” was rolled out in a beta state. The kit allows for the import of environments and elements created inside the game engine Unity3D, thereby significantly expanding the 3D design options. If widely adopted, this is likely to break up the fairly unified aesthetics of Rec Room in the future. On the one hand, Rec Room Studio targets companies that want to be present on the platform with their own corporate visual designs14There is a dedicated paragraph on the feature webpage addressing readers that “are a company or brand” (https://recroom.com/studio).; on the other hand, it can also be understood as a reaction to the success of Rec Room’s direct competitor VRChat, which follows a different logic of aesthetic creation.

VRChat

HISTORY AND AVAILABILITY

VRChat was first released by software engineer Graham Gaylor for the then new Oculus Rift HMD in early 2014. Alongside the later discontinued platform Riftmax, the app quickly assembled a small community of VR enthusiasts using it for socializing, exploration, development and discussion in the early years of consumer VR. At the point of release, VRChat was in a very basic state, and it has retained its “early access” status, remaining in development to this day. Its core functionality was, and still is, the hosting and mediating of networked virtual co-presence through 3D avatars, leaving most of everything else to its users. Contrary to Rec Room, VRChat never had a unified aesthetic design concept: user-created content is hugely important to the platform and has been the main reason for its popularity.

Similar to most social VR platforms, VRChat does not limit accessibility exclusively to users with VR hardware. Desktop clients for Windows and macOS were deployed early, with the latter being discontinued in the first half of 2016, when support for the newly released HTC Vive HMD via SteamVR was added. Client downloads directly from the VRChat homepage were phased out in the following years in favor of larger app stores tied to the different disjoint and competing VR device ecosystems. A combined PC desktop and VR version, made accessible via the Steam software platform’s early access program in mid-2017, subsequently drew in more users who approached the application from a video-gaming perspective. There is currently no native support for Linux or macOS, but an alpha version of a mobile app for Android was released in August 202315Announced at https://ask.vrchat.com/t/developer-update-17-august-2023/19495..

ECONOMY AND ADOPTION

Since its inception, the initial two-person LLC (Gaylor teamed up with programmer and game designer Jesse Joudrey shortly after the initial release to launch the company) has evolved into a business with several dozen reported full-time employees. VRChat Inc. has been financed through several funding rounds totalling about $95 million16Numbers from https://www.crunchbase.com/organization/vrchat/company_financials [accessed 2023, December 5].. To the author’s knowledge, the company has so far not disclosed revenue or valuation figures or even a business model. The application is largely free to use, with a subscription service called “VRChat Plus” offering exclusive or early access to select features, but the revenue from subscriptions is unlikely to cover a significant part of the cost of infrastructure, support and development. The latest – and by far largest – funding round in 2021, providing the company with an $80 million backing led by US venture investment firm Anthos Capital17See previous footnote., was linked in a company blog post to the ambition of further growing the user base and implementing a “creator-driven economy”18This was laid out in a blog post by VRChat “Head of Community” Tupper on behalf of “The VRChat Team & Investors” here: ([Tupper], 2021b)., i.e. mechanisms allowing users to pay each other inside the platform. Such a payment infrastructure, similar to Rec Room’s “Community Commerce”, would enable the company to profit from transaction fees that have so far been taken in by external platforms like Booth, Gumroad or Patreon, which have become hosts to the community’s lively informal content market economy (Au, 2021).

There is no comprehensive public data on VRChat’s monthly active users. Similarly to Rec Room Inc., the company is not interested in making its adoption and usage data transparent. Instead, it occasionally publishes new record highs of concurrent users, i.e. the maximum number of accounts logged in simultaneously at a select moment. These were reported to be about 40.000 on New Year’s Eve 2020 (Tupper, 2021a) and more than twice that number one year later (Au, 2022a). There appears to be a tacit agreement between the two competing platforms: while each publishes peak user numbers to demonstrate its successful growth and adoption, those numbers are chosen in a way that prevents direct comparison. In combination with the hard-to-measure structure of a plethora of instanced virtual rooms on a platform, this makes it impossible to do more than guess the actual user counts. Generally, VRChat’s total user base is often (but without transparent figures to back it up) assumed to be lower than that of its direct competitor, though with a higher percentage of actual VR hardware users due to its advanced motion tracking support. Steam usage statistics indicating PC desktop and VR users usually rank VRChat significantly higher than Rec Room19At the time of writing, the Steam user count for VRChat is roughly 20 times the one of Rec Room as per https://steamdb.info/charts/?category=53&select=1&compare=438100%2C471710 [accessed 2023, December 5]. but do not represent mobile or any other users not connecting via the service, with the former being a significant part of Rec Room users according to the company20A Rec Room representative reported in 2022 that “at this point VR is a pretty low percentage of our monthly players” and then referred to the bulk of users coming from various ecosystems not represented on Steam (Lang, 2022)..

On the technological side, VRChat supports more advanced VR hardware technology than most of its competitors, like up to 11-point full body tracking21In VR, a user’s position, posture and body movement is usually tracked at at least three points: the head via the HMD, one or two hands via the controllers or visual hand tracking systems. All other limbs are inferred through a plausibility system called inverse kinematics. Tracking accuracy can be increased by adding more points at the feet or between key joints like hips, knees or elbows. VRChat supports tracking devices that interface with Valve’s optical “lighthouse” system, but can also be expanded by solutions that are compatible with SteamVR’s protocols. See https://docs.vrchat.com/docs/full-body-tracking [accessed 2023, December 5]., and features a generous scripting API. Despite prominent claims that “legs are hard”22“Seriously, legs are hard” was proclaimed by Meta’s Mark Zuckerberg at the “Meta Connect VR” conference in 2022 when announcing full body avatars, followed by the erroneous statement “[…] which is why other virtual reality systems don’t have them either” (Hern, 2022). in VR, VRChat avatars have long been able to accommodate not only legs with inverse kinematics and/or tracking, but also dynamically moving tails/hair/costume parts, advanced custom shaders, prerecorded movement animations and a wide range of avatar sizes. This has led to the platform garnering a power user base of people willing and able to invest in VR hardware allowing for higher degrees of embodiment. Consequently, users with VR hardware and “screen” users without it can have very different experiences when using the platform, which sometimes leads to differing social behavior and contributes to cultural stratification along hardware ownership lines.

AESTHETIC CONCEPT

VRChat’s significant informal community content market, with users selling, trading and commissioning avatars and sometimes rooms among each other, is a result of its aesthetic production paradigm. The platform has encouraged and relied on user-created content pretty much from the start by providing a software development kit (SDK) plugging into the free-to-use Unity3D game engine. Early on, VRChat founder Graham Gaylor expressed his belief that custom content creation was key to evolving metaverse applications, as it had been for social web platforms23See Thompson (2014) at minutes 17:22 & 48:56. – virtual environments and avatars being the equivalent of user-generated text and image content on “web 2.0” social media. As with these previous platforms, social VR’s appeal and worth would come to depend on its users’ creative labor24The knowledge threshold for user creation in 3D spaces is still significantly higher than on “classical” social media though, with a wider gap between content creation and social practice. This has led to, e.g., unique avatars becoming a sought-after commodity on VRChat..

The “look and feel” as well as the social dynamics on VRChat today are a direct consequence of this decision to have almost all content be generated25“Generated” may at the most basic level not mean much more than uploaded by users – “stealing“/copying/recreating content from games, movies or single creators and other forms of copyright infringement are not uncommon, much like in other online spaces with liberal content politics, where enforcement of IP legislation is at odds with a business’s intrinsic interest in growth through cultural adoption. by users. The first VRChat application itself had been quickly assembled in Unity3D by Gaylor, using free-to-use scenes from the Unity Asset Store as environments and a simple humanoid avatar in T-pose as readymades for testing out the functionality of networked VR. Since there were no aesthetic parameters but only technical limitations, interested users soon began experimenting intensely with the possibilities and limitations of imagining and creating avatars and spaces, using the provided SDK. With a growing influx of “very online” users in the following years came recreations of popular games, pop culture figures and memes. Avatars especially became a kind of social trading good in the community, sometimes spreading very fast and creating memetic phenomena spilling into wider online culture. Over time, VRChat users developed a deliberate aesthetic eclecticism that also made the platform increasingly attractive for content creators on video and streaming platforms like YouTube and Twitch, who thus became another part of the developing informal cultural economy.

VIRTUAL COMMUNITIES

VRChat’s eclecticism and avatar affordances have become a breeding ground for distinctive and overlapping communities around identities and practices with a high emphasis on embodied aesthetics, like

  • a long-standing clubbing/partying scene as well as a dedicated dancing community focused on e-girl & e-boy styles,
  • a transgender community using the affordances of virtual “morphological freedom“26Founder of the VRChat “Trans Academy” Tizzy in an interview with VTuber Phia: “[I]n 2016, when I was looking to have facial feminization surgery, I brought a screenshot of my second life avatar because it was the person that I felt the most comfortable and happy as. That might seem a little bit taboo now but I think that as social VR and the metaverse become more of an integral part of our society in the future, we’re going to see a lot more people prototyping their identity in these spaces and embracing the idea of having morphological freedom” (Bollinger, 2023). and sharing advice on gendered body movement and voice training,
  • a virtual furry community enjoying the low entry threshold of VR avatars as opposed to the high prices of physical fursuits, with the community’s most recent convention on the platform drawing more than 15.000 participants according to the organizers,
  • a diverse roleplaying community with different game worlds and stories as well as meta-roleplaying troupes with a high mobility on the platform like the “Loli Police Department”,
  • and a meme community that strongly influenced the platform’s public image because of its attractiveness for live streamers.

The latter’s appeal to underage users and people close to online trolling and “shitposting” culture, and their presence in public VRChat rooms, has driven other local communities to largely avoid public worlds and rely on non-public rooms and invitation mechanics, operating their own events and social spaces within the platform’s wider ecosystem. This dynamic has begun to create something akin to a VRChat society, in which interest groups negotiate their sometimes aligned, sometimes conflicting interests through different channels.

VRChat is also frequently referred to as having been instrumental in developing distinct virtual socio-physical practices and conventions: “headpatting” as a gesture of affection, silent rooms where users can doze or sleep while wearing their HMDs, and a growing number of users engaging in erotic roleplay (ERP) in VR. The latter has been met with concern by some longer-term platform users because it amplifies or contributes to a growing sexualization of avatars on the platform27Arguably, sexualization is part of the complex intercultural history of anime aesthetics at large, so this tendency was prevalent in a community relying heavily on those aesthetics for their avatars pretty much from the start. It only seems to have become problematic for this community though when combined and thus increasingly identified with actual socio-sexual practices – a process that in itself would make an interesting case for exploring the differential value judgments at play in communities forming around visual representations of bodies, identities and desire..

All of these practices as well as their exemplary sub-communities share a strong connection with corporeality. Thanks to its advanced tracking support, VRChat has become one of the few platforms that can accommodate the aesthetic realization of this relationship with and desire for embodiment, where “physical bodies [are] the immediate and sole interface between [users] and their avatars” (Freeman et al., 2020). The relatively large degree of technical freedom in the creation especially of avatars has also given VRChat a long history of hacks and so-called “crashers” – code-based modifications that can be employed as a weapon to freeze or kick other users out of the game, sometimes in quite elaborate and aesthetically overwhelming ways. Crashers operating with shader programming in particular combine the affective experience of being forcefully ejected from a (virtual) social reality with an intense aesthetic overload likely to provoke strong physical reactions in HMD users: they not only crash the software, they go for the sensory system of its users, too.

Like avatars in general, such crashers have long been traded among VRChat users, be it for offensive trolling or for self-defense purposes. The technical affordances allowing for such virtual weapons, as well as the comparatively weak content moderation on the platform, have made many community members somewhat resilient to attacks, insults, flaming, etc., leading them to regard harassment as an annoying yet not truly avoidable social phenomenon, at least in public worlds. The danger of being attacked or insulted is seen as a trade-off for the power of forming, defining and developing community and community aesthetics “from the ground up”. The aesthetic sandbox is a social sandbox as well, where too many preventive restrictions are undesired even by users experiencing harassment, “as they might prevent the open dialogues that drew users to the technology in the first place” (Shriram & Schwartz, 2017).

Contrastingly, Rec Room communities, with their limitations in avatar design, have developed less around virtual corporeality and more around playful practices. Many users are heavily invested in the games the platform offers – also because the “Rec Room Original” PvP games like paintball or laser tag in particular work really well from a vsports28 perspective. Generally, users seem to follow the central metaphor and conceptual idea of Rec Room as a “social club” built around sports activities, and also partake in the regular special events the company designs around tasks and token/item collection, sometimes building a narrative around the fictional platform universe. But there is also a creative community focused on building worlds, costumes or painting in Rec Room, as well as sub-communities based on aesthetic creation, such as (military) roleplaying or pop-cultural fandom. For creators, being confined by the narrower aesthetic limits of the platform is a creative challenge balanced by the entanglement of attention and token economy. Lastly, as in VRChat, there are also identity-centered communities/servers for LGBTQ or furry users, although they appear to be less prominent.

When, in 2023, Rec Room announced the upcoming integration of full-body avatars (i.e. bodies with legs) and individual finger movement, a significant portion of users seemed rather wary of such changes.29 Longer-term users especially seem to identify with the stylized aesthetics of the platform and take a rather conservative stance towards changing the simplified look. In these discussions, users regularly invoke VRChat as the aesthetic negative to their own appreciation of Rec Room, emphatically describing the dread they feel when confronted with VRChat’s radical aesthetic inconsistency of avatars and worlds. In contrast, they value the stable and defined aesthetic normality across the Rec Room universe, for it allows them to concentrate on the core activities of gaming and socializing.

GOVERNANCE

For the purpose of this text, I assume a correlation between the aesthetic and the social regulation of social VR platforms as a working hypothesis. If such a correlation existed, however complicated by the fuzziness of cultural processes, we would expect spending time in Rec Room to be an experience significantly less likely to be socially disruptive or stressful. Indeed, the platform is not only more coherent, it also has more developed moderation/policing features than VRChat: there is a system for appointing and rewarding community moderators, a third-party algorithm actively monitors users’ speech for forbidden words,30 and features like an embodied gesture for quickly blocking other users in threatening situations show that user safety is considered on a variety of levels. It is no wonder then that in the academic literature on social VR, Rec Room is discussed more prominently and also more positively than VRChat when it comes to questions of safety and harassment,31 with the latter usually being characterized as a “wild west” (McVeigh-Schultz et al., 2019) “known for non-normative social interactions” (Zheng et al., 2022).
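
The keyword-surveillance component mentioned above can, in its most reduced form, be imagined as a blocklist lookup over transcribed speech. The following Python sketch is purely illustrative and not Rec Room’s actual system; the `flag_utterance` function and the placeholder blocklist are hypothetical, and real moderation services add fuzzy matching, context models and human review on top of such lists:

```python
# Illustrative sketch of naive keyword filtering for transcribed user
# speech. All names and terms here are placeholders, not a real system.

FORBIDDEN = {"slur1", "slur2"}  # placeholder terms, not an actual blocklist

def flag_utterance(transcript: str) -> list[str]:
    """Return the forbidden words found in a transcribed utterance."""
    words = (w.strip(".,!?").lower() for w in transcript.split())
    return [w for w in words if w in FORBIDDEN]

print(flag_utterance("hey slur1, nice avatar!"))  # -> ['slur1']
```

The set lookup makes each token check constant-time, which is why even very large blocklists remain cheap to apply to live speech transcripts.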

While intuitively plausible, there might also be some bias at play here. Academic research on social VR, when more than pure literature review, has so far concentrated on design features and on harassment as a potential design problem. Skimming through a number of papers and their methodologies shows that researchers spend surprisingly little time on the platforms they are writing about. There is a serious lack of ethnographies about and on social VR platforms that would enable us to learn how those platforms’ users make sense of and navigate the social space(s) they inhabit and, for the most part, create.32 Harassment is one part of this social space, and users respond to it within the frame of the more general community politics, explicit policies and tacit rules of their specific platform – their response is part of the “attendant literacies, interaction conventions, and common practices that exist in a feedback loop between the (top-down) designed affordances of various online social platforms and the (bottom-up) practices of virtually embodied players seeking to communicate” (Tanenbaum et al., 2020).

In reality, hate speech is a problem on both Rec Room and VRChat, as is the bullying of certain user groups such as furries33 – despite the platforms’ different degrees of moderation and implementation of safety features. On both platforms, it does not take long to encounter nazi roleplaying or discriminatory talk. On both platforms, sexual harassment is a problem evolving from its already prevalent and well-described occurrences in virtual social spaces in the wider sense into the new affordances of embodiment and immersion of VR technology – a problem made even more pressing by the significant presence of minors. Additionally, underage users themselves form, on both platforms, a group that many older members see more as annoying (“squeakers”) than as vulnerable, which can lead to social tension as well.

As has often been established for all sorts of virtual environments in the wider sense, such social problems will keep appearing and shape-shifting in online spaces as long as they exist in the so-called “real world”. While design-relevant, they are not design-solvable problems. “[I]ntensified old concerns in the new world” (Zheng et al., 2022), they now appear in a context with new conditions and possible complications. This new context is, on the one hand, defined by the more intense bodily experience of interactions in virtual reality as a medium, with the consequence of “less boundaries […] that can rule and determine what are reasonable, psychologically safe and permissible ways for other people to behave around self and how self will respond when someone steps outside those limits” (Zheng et al., 2022). But this context also carries the vectors and effects of the platforms’ differing creative/aesthetic paradigms. How can these paradigms be described when thinking about governance?

Of the two examples considered in this text, Rec Room seems to fit the top-down model of a benevolent ruler. “Rooms are behavior”, as one developer put it in an interview (McVeigh-Schultz et al., 2019), and the company retains considerable control over the social cues they allow virtual spaces to give users on their platform. Communitization takes place around competitive playful activities, mediated by a ubiquitous gamified economy and within a unifying aesthetic atmosphere regulating the expressions users are able and allowed to make. While following a platform structure of thousands of parallel and synchronous bounded virtual rooms, the centralization of important conditions of social experience within these rooms hedges boundary-testing experiments as well as unwanted violations of the social contract. In that, Rec Room policy follows Blackwell et al.‘s recommendation that “designers could directly influence the norms of individual communities and groups through design ‚nudges’ ” (Blackwell et al., 2019) – a socio-aesthetic technology of governance that has implications far beyond the scope of dealing with harassment. This is ever more true because Rec Room’s vision of “democratization” has from its inception been linked closely to monetization through community commerce: it is at its core an economic experiment. As a consequence, community politics “on the ground” appear to develop between the poles of an aesthetic conservatism shying away from “too much” diversity and a growing consciousness of the stratification effects and exploits of the platform’s token economy.34

In contrast, VRChat’s focus on embodiment effects and a very liberal user-generated asset production has spawned a multiplicity of partly overlapping, partly opposed sub-communities that have made the platform something like the Reddit of VR. In an equally liberal low-moderation environment, members of those communities have often developed platform-specific resilience against equally platform-specific threats. The lively and sometimes unhinged creativity of community members has influenced the pop-cultural image of social VR more than existing research has acknowledged, and VRChat communities politicize mainly around the conditions for this appeal – especially when they find them endangered. The company had to acknowledge this in mid-2022, when users became enraged about a new anti-cheat function that was meant to prevent tampering with the client software but effectively barred an entire modding community that had also taken on the responsibility of providing platform access to users with impaired eyesight or hearing; a rage that manifested in large-scale review bombing35 and in a number of active and creative users leaving for smaller competitors like Neos VR or Chillout VR.

While methodologically robust ethnographic research is still lacking, it seems a plausible hypothesis that the less safe and less regulated environment of VRChat has led to a higher degree and valuation of self-governance in most of the platform’s communities.36 The vector of this form of community self-governance is the protection of the peer group and the self against the dangers of the platform’s evolving social ecosystem: it tends to produce entry barriers and exclusion mechanics around the community itself. If this tendency becomes too strong, neglect of the social space between communities – an equivalent to the democratic concept of public space on a metaverse platform – might become a problem for the social revitalization of communities as well as for user and company growth at large, because it is in these liminal communal rooms that the onboarding of new users commonly happens.

Both differing platform cultures and models of governance provide starting points for thinking about how democratic structures might develop and be stabilized in virtual worlds employing VR technology. While their structural trajectories seem to partially converge – Rec Room opening up aesthetically with a new Unity SDK, VRChat working towards integrated community commerce – it is yet to be seen what role their different community cultures will play in this convergence. This is of interest also because what happens in the space of VR social technology adoption has wider implications for an increasingly virtualized social reality as envisioned by “metaverse” evangelists: if VR technology finds more users, social VR ecology will likely have been a large-scale model of the digital community politics to come.

Reflecting on the structural role of the possibilities and limits of aesthetic creation in VR – on how it forms the basis for making sense of and representing bodies and worlds, on its entanglement with economic flows and the production of social order – acknowledges the intuition that “the affordances that designers and other practitioners deem important will inevitably shape an extensive portion of human social interactions today and in the future” (Kolesnichenko et al., 2019). Design decisions for social worlds are always political decisions, and aesthetic governance is an important part of intersectional affective biopolitics in a mediatized world. It has been conceptualized for urban planning (Ghertner, 2015) and for (social) media studies (Elias et al., 2017), and it becomes even more relevant where the virtual production of space, bodies and sociality merge.

If we ask how the two currently largest social VR platforms’ differing paradigms of worldbuilding and aesthetic creation relate to democratic culture, we cannot ignore the fact that both platforms are proprietary infrastructures run by competing private companies – spawning and harboring social communities is their mode of redeeming the venture capital invested in them. With the economic allure of the “metaverse” being the redesigning and virtualizing of the social for an increased extraction of value, both platforms serve as examples of possible pathways towards the likely conflictual realization of this goal. These pathways differ right from the start – one beginning as an integrated business concept with thoughtful planning, the other as an experiment growing out of a VR tech enthusiast community trying, and often struggling, to keep up with its own development.

Paradoxically, while Rec Room thus takes on the “classical” role of a governing state much more than VRChat – setting and enforcing social policies, controlling the economic infrastructure, regulating the possible and impossible relations of what is “normal” and what is not, ensuring fairly equal access for different (hardware) populations – its users seem to regard it more as a regular online game provider than VRChat’s users treat “their” platform. This might, for one, be due to the fact that the libertarian plurality of VRChat indeed resembles the current image of a neoliberalized democracy more than the “all fun and games” uniformity of Rec Room does, down to the rituals of partaking in mass demonstrations (like the recent review bombing mentioned earlier) or performing the disgruntled citizen alienated from “the powers that be”. The more powerful element charging this relationship, though, might be the higher grade of embodiment afforded by the platform, tethering its core user base much more intensely to the experience of having a second body living a second social life in a second reality. In short, they choose the platform not for leisure or monetary gain, but because it allows them to realize themselves on multiple levels – to become. If the claim to diversity and plurality of current (liberal) democracies is to be taken seriously, then this indicates that these concepts will mean more in social VR than choosing the skin color and gender attributes of an otherwise standardized 3D comic character, or even embodying a “realistic” 3D-scanned copy of one’s own physical body: it rather means the ability and opportunity to access the “morphological freedom” the technology promises in the first place.

On another note, the economic aspects of this freedom have only begun to be tested. Who controls the infrastructures facilitating the production and trade of virtual bodies? What does body ownership in VR mean not as a psychological effect, but as a social question negotiated between fast-swapping dozens of freely copyable avatars as a communicative practice on the one hand, and identifying with a unique virtual body, demanding structural protection of its integrity and uniqueness, on the other? Who will profit from being able to have a body in VR to start with? Will certain ways of looking be valued and priced higher than others, as is true for much of the physical world, or will beauty and its valorization become subject to a radical re-negotiation amongst bodies-as-humans, bodies-as-animals, bodies-as-objects, bodies-as-rooms and as yet other unimaginable forms of being or being-experienced?

Companies invested in building a “metaverse” fully replacing the “real world” as the primary realm of the social37 are quick to acknowledge that platforms that “enable anybody to create and share their own social virtual worlds […] shouldn’t be built privately, but rather alongside a passionate community who can help shape the future”.38 While it stands to reason that platforms are eager to enlist their users’ labor for building their virtual realities, it is yet another question who will actually own them. The more the actual fabric of a platform consists of the results of its users’ creative labor, the more contested this question will be. Asking about the distribution and implementation of aesthetic governance can give us hints on how it could or should be answered.

REFERENCES

Au, W. J. (2021, September 22). Inside VRChat’s Informal Economy: An Estimated 350+ People Make Real Money from that Metaverse—Many Likely Earning a Full-Time Income! UPDATE: More Like Over 1000 [Weblog]. New World Notes. https://nwn.blogs.com/nwn/2021/09/vrchat-community-economy-gumroad-patreon-youtube-1.html

Au, W. J. (2022a, January 4). VRChat User Concurrency Hit Nearly 90,000 Last New Year’s Eve! [Weblog]. New World Notes. https://nwn.blogs.com/nwn/2022/01/vrchat-concurrency-2021.html

Au, W. J. (2022b, April 13). BREAKING: Rec Room Has Over 3 Million Monthly Active VR Users, Most on Meta’s Quest 2; That’s 10x More Users Than Meta’s Horizon Worlds & Venues! [Weblog]. New World Notes. https://nwn.blogs.com/nwn/2022/04/rec-room-vr-mau-2022-install-base.html

Blackwell, L., Ellison, N., Elliott-Deflo, N., & Schwartz, R. (2019). Harassment in Social Virtual Reality: Challenges for Platform Governance. Proceedings of the ACM on Human-Computer Interaction, 3(CSCW), 1–25. https://doi.org/10.1145/3359202

Bollinger, S. (2023, November 4). Why are there SO many Trans people in VRChat? Gender, Identity, and Self Discovery. [Video]. YouTube. https://www.youtube.com/watch?v=Mh3HTTa5NFU

BVR. (2022, September 29). Why is Everything SO EXPENSIVE in Rec Room? [Video]. YouTube. https://www.youtube.com/watch?v=dnkq0m2lEeA

Elias, A., Gill, R., & Scharff, C. (2017). Aesthetic Labour: Beauty Politics in Neoliberalism. In A. S. Elias, R. Gill, & C. Scharff (Eds.), Aesthetic Labour (pp. 3–49). Palgrave Macmillan UK. https://doi.org/10.1057/978-1-137-47765-1_1

Freeman, G., Zamanifard, S., Maloney, D., & Adkins, A. (2020). My Body, My Avatar: How People Perceive Their Avatars in Social Virtual Reality. Extended Abstracts of the 2020 CHI Conference on Human Factors in Computing Systems, 1–8. https://doi.org/10.1145/3334480.3382923

Ghertner, D. A. (2015). Rule by aesthetics: World-class city making in Delhi. Oxford University Press.

Handley, R., Guerra, B., Goli, R., & Zytko, D. (2022). Designing Social VR: A Collection of Design Choices Across Commercial and Research Applications. https://doi.org/10.48550/ARXIV.2201.02253

Hayden, S. (2020, December 1). ‘Rec Room’ Now Lets Premium Users Sell Creations for In-game Currency [Weblog]. Road to VR. https://www.roadtovr.com/rec-room-sell-items-tokens-economy/

Hern, A. (2022, October 12). Meta’s virtual reality project will finally have legs – literally. The Guardian. https://www.theguardian.com/technology/2022/oct/12/meta-virtual-reality-legs-mark-zuckerberg-facebook

Jonas, M., Said, S., Yu, D., Aiello, C., Furlo, N., & Zytko, D. (2019). Towards a Taxonomy of Social VR Application Design. Extended Abstracts of the Annual Symposium on Computer-Human Interaction in Play Companion Extended Abstracts, 437–444. https://doi.org/10.1145/3341215.3356271

Kolesnichenko, A., McVeigh-Schultz, J., & Isbister, K. (2019). Understanding Emerging Design Practices for Avatar Systems in the Commercial Social VR Ecology. Proceedings of the 2019 on Designing Interactive Systems Conference, 241–252. https://doi.org/10.1145/3322276.3322352

Lang, B. (2022, April 13). Virtual Social Platform ‘Rec Room’ Hits 3 Million Monthly Active VR Users [Weblog]. Road to VR. https://www.roadtovr.com/rec-room-monthly-active-vr-users-3-million-peak/

Liu, Q., & Steed, A. (2021). Social Virtual Reality Platform Comparison and Evaluation Using a Guided Group Walkthrough Method. Frontiers in Virtual Reality, 2, 668181. https://doi.org/10.3389/frvir.2021.668181

McVeigh-Schultz, J., Kolesnichenko, A., & Isbister, K. (2019). Shaping Pro-Social Interaction in VR: An Emerging Design Framework. Proceedings of the 2019 CHI Conference on Human Factors in Computing Systems, 1–12. https://doi.org/10.1145/3290605.3300794

Retr0. (2021, March 28). The Evolution of Rec Room (Release, 2016 and 2017) [Video]. YouTube. https://www.youtube.com/watch?v=PH7Wsa2V1Nk

Schultz, R. (2023, September 21). Welcome to the Metaverse: A Comprehensive List of Social VR/AR Platforms and Virtual Worlds (Including a List of Blockchain Metaverse Platforms) [Weblog]. News and Views on Social VR, Virtual Worlds, and the Metaverse. https://ryanschultz.com/list-of-social-vr-virtual-worlds/

Shriram, K., & Schwartz, R. (2017). All are welcome: Using VR ethnography to explore harassment behavior in immersive social virtual reality. 2017 IEEE Virtual Reality (VR), 225–226. https://doi.org/10.1109/VR.2017.7892258

Slater, M. (2009). Place illusion and plausibility can lead to realistic behaviour in immersive virtual environments. Philosophical Transactions of the Royal Society B: Biological Sciences, 364(1535), 3549–3557. https://doi.org/10.1098/rstb.2009.0138

Slater, M., Spanlang, B., Sanchez-Vives, M. V., & Blanke, O. (2010). First Person Experience of Body Transfer in Virtual Reality. PLoS ONE, 5(5), e10564. https://doi.org/10.1371/journal.pone.0010564

Tanenbaum, T. J., Hartoonian, N., & Bryan, J. (2020). “How do I make this thing smile?“: An Inventory of Expressive Nonverbal Communication in Commercial Social Virtual Reality Platforms. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–13. https://doi.org/10.1145/3313831.3376606

Taylor, M. (2022, July 26). VRChat is being review bombed to hell and back after anti-cheat announcement. PC Gamer. https://www.pcgamer.com/vrchat-is-being-review-bombed-to-hell-and-back-after-anti-cheat-announcement/

Thompson, G. S. (2014, May 19). Virtually Incorrect with Gunter—Episode #5 [Video]. YouTube. https://www.youtube.com/watch?v=jxQRzOckQu0

Tupper. (2021a, January 5). VRChat’s New Years 2021—Or, what the $%@& was that? [Weblog]. https://medium.com/vrchat/vrchats-new-years-2021-or-what-the-was-that-d84334789f77

Tupper. (2021b, June 25). VRChat Partners with Anthos Capital to Close $80M Series D [Weblog]. https://medium.com/vrchat/vrchat-partners-with-anthos-capital-to-close-80m-series-d-d706a1ab481b

Zhan, S. (2020, December 18). RecRoom: A digital third place for kids at heart [Weblog]. https://medium.com/sequoia-capital/a-digital-third-place-for-kids-at-heart-2646dfd55e00

Zheng, Q., Wang, L., Do, T. N., & Huang, Y. (2022). Facing the Illusion and Reality of Safety in Social VR. CHI Conference on Human Factors in Computing Systems (CHI ’22), April 29–May 5, 2022, New Orleans, LA, USA. https://doi.org/10.48550/arXiv.2204.07121

  • 1
    In this text, the term “VR” denotes technologically mediated immersive digital 3D environments, while the word “virtual” may in a wider sense also refer to other non-physical/online spaces, communities, practices or phenomena.
  • 2
    The terms “space”, “room”, “world” or sometimes “map” are often used interchangeably when talking about places inside social VR. I follow that mode of usage and reserve the term “platform” for speaking about the whole system of spaces, the infrastructure of which is most often run and owned by one company. I leave open source self hosted systems like Mozilla Hubs aside because they so far have not generated the community effects I am interested in.
  • 3
    For a closer look on avatars within social VR see Kolesnichenko et al. (2019).
  • 4
    Cited from the original press release accompanying the application’s launch, archived under https://web.archive.org/web/20160620140618/http://www.againstgrav.com/press.
  • 5
    “Fairly large” should be understood in comparison to other social VR apps. While, technically, browser-based platforms like Mozilla Hubs would be accessible from any device with a compatible browser and thus have the lowest threshold for entry and the widest possible adoption, in practice companies controlling access to VR applications via their stores have been reluctant to include, and have sometimes actively excluded, WebXR-compatible browsers in order to limit non-proprietary platforms.
  • 6
    Support for Quest 1 devices was discontinued in the first half of 2023 when Meta deprecated the relevant SDK.
  • 7
    Numbers cited from https://www.crunchbase.com/organization/against-gravity/company_financials [accessed 2023, December 5].
  • 8
    See https://recroom.com/roomcurrencies [accessed 2023, December 5].
  • 9
  • 10
    For these and the following descriptions compare McVeigh-Schultz et al. (2019), who have interviewed Rec Room designers about their decisions.
  • 11
    3D objects are usually built out of simple polygons. The number of polygons an object consists of limits its geometric complexity and correlates with the computational power needed for its visual rendering. Since technological development in graphics computation power is accompanied by a drive for higher-fidelity 3D realism, simpler “low poly” aesthetics have themselves become associated with a nostalgic look.
  • 12
    YouTuber Retr0‘s video “The Evolution of Rec Room (Release, 2016 and 2017)” gives an impression of the aesthetic development, but also consistency over the years (Retr0, 2021).
  • 13
    Since consumer VR hardware commonly provides movement tracking of only three points – head and hands – leg movement and positioning usually have to be inferred computationally. The company describes the rationale of the original avatar design in a blog post as follows: “We avoided showing untracked legs and arms because it could break the feeling of presence; we kept facial features cute and minimal to avoid the uncanny valley effect; and we chose simplicity over visual detail so the game ran smoothly” (https://blog.recroom.com/posts/avatars).
  • 14
    There is a dedicated paragraph on the feature webpage addressing readers that “are a company or brand” (https://recroom.com/studio).
  • 15
  • 16
    Numbers from https://www.crunchbase.com/organization/vrchat/company_financials [accessed 2023, December 5].
  • 17
    See previous footnote.
  • 18
    This was laid out in a blog post by VRChat “Head of Community” Tupper on behalf of “The VRChat Team & Investors” here: ([Tupper], 2021b).
  • 19
    At the time of writing, the Steam user count for VRChat is roughly 20 times the one of Rec Room as per https://steamdb.info/charts/?category=53&select=1&compare=438100%2C471710 [accessed 2023, December 5].
  • 20
    Rec Room representative reported in 2022 that “at this point VR is a pretty low percentage of our monthly players” and then referred to the bulk of users coming from various ecosystems not represented on Steam (Lang, 2022).
  • 21
    In VR, a user’s position, posture and body movement are usually tracked at at least three points: the head via the HMD, and one or two hands via controllers or visual hand-tracking systems. All other limbs are inferred through a plausibility system called inverse kinematics. Tracking accuracy can be increased by adding more points at the feet or between key joints like hips, knees or elbows. VRChat supports tracking devices that interface with Valve’s optical “lighthouse” system, but can also be expanded by solutions that are compatible with SteamVR’s protocols. See https://docs.vrchat.com/docs/full-body-tracking [accessed 2023, December 5].
  • 22
    “Seriously, legs are hard” was proclaimed by Meta’s Mark Zuckerberg on the “Meta Connect VR” conference in 2022 when announcing full body avatars, followed by the erroneous statement “[…] which is why other virtual reality systems don’t have them either” (Hern, 2022).
  • 23
    See Thompson (2014) at minutes 17:22 & 48:56.
  • 24
    The knowledge threshold for user creation in 3D spaces is still significantly higher than on “classical” social media though, with a wider gap between content creation and social practice. This has led to, for example, unique avatars becoming a sought-after commodity on VRChat.
  • 25
    “Generated” may at the most basic level not mean much more than uploaded by users – “stealing”/copying/recreating content from games, movies or single creators and other forms of copyright infringement are not uncommon, much like in other online spaces with liberal content politics, where enforcement of IP legislation is at odds with a business’s intrinsic interest in growth through cultural adoption.
  • 26
    Founder of the VRChat “Trans Academy” Tizzy in an interview with VTuber Phia: “[I]n 2016, when I was looking to have facial feminization surgery, I brought a screenshot of my second life avatar because it was the person that I felt the most comfortable and happy as. That might seem a little bit taboo now but I think that as social VR and the metaverse become more of an integral part of our society in the future, we’re going to see a lot more people prototyping their identity in these spaces and embracing the idea of having morphological freedom” (Bollinger, 2023).
  • 27
    Arguably, sexualization is part of the complex intercultural history of anime aesthetics at large, so this tendency was prevalent pretty much from the start in a community relying heavily on those aesthetics for its avatars. It only seems to have become problematic for this community, though, when combined with and thus increasingly identified with actual socio-sexual practices – a process that would itself make an interesting case for exploring the differential value judgments at play in communities forming around visual representations of bodies, identities and desire.
  • 28
    While the term “vsports” does not seem to be in use yet, it makes a lot of sense to distinguish virtual sports activities, with their emphasis on whole-body movement, from egaming/esports, which require more isolated hand-eye coordination. 
  • 29
    For an exemplary discussion among Rec Room users that mostly focuses on the aesthetics of single fingers, see: https://www.reddit.com/r/RecRoom/comments/143hytj/what_are_your_opinions_on_rec_room_having_hand/
  • 30
  • 31
    One literature review e.g. lists VRChat explicitly as “known for harassment and unpredictable social encounters” in a long table of otherwise neutral or advertisement-like descriptions of different platforms’ functionalities/USPs (Handley et al., 2022).
  • 32
    In addition, Rec Room developers and other company staff seem to be much more accessible for interviews with researchers, which also leads to a certain representational bias.
  • 33
    Searching for “furries rec room” on YouTube yields plenty of videos with titles like “trolling furries on rec room”, “Killing furries in Rec Room”, “Making furries cry in Rec Room”, “Infiltrating Furry Rec Room Servers” etc.
  • 34
    Rec Room’s General Partner at main investor Sequoia Capital describes the platform’s vision of building community around games “both for fun and to earn money” in a blog post celebrating the Series D funding round like this: “Rec Room’s vision is to democratize access for anyone to create with the most sophisticated yet simple-to-use creator tools (no coding required!). The team is also excited to launch P2P monetization to enable creators to monetize their own creations — enabling the new side hustle for kids” (Zhan, 2020).
  • 35
    Community VTuber BVR proposed a system of upper, middle and lower classes depending on users’ token wealth in a video titled “Why is Everything SO EXPENSIVE in Rec Room?” (BVR, 2022), assigning content creators to the wealthiest class. Road to VR editor Scott Hayden pointed to the risk of “gambling, money laundering, and other illicit behavior” within Rec Room as early as 2020 (Hayden, 2020).
  • 36
    Thousands of furious user reviews temporarily lowered VRChat’s Steam rating to “mostly negative”, prompting coverage in the gaming press and beyond that conjured apocalyptic imagery of the platform “being absolutely nuked into the ground” (Taylor, 2022).
  • 37
    Rec Room cofounder Nick Fajt’s vision as relayed by their main provider of investment capital: “[W]hereas the the [sic!] last era of social centered on sharing real world experiences online, the next era of social would be centered on both creating and sharing these moments online” (Zhan, 2020).
  • 38
    Quoted from the developer statement about VRChat’s “early access” status on Steam: https://store.steampowered.com/app/438100/VRChat/ [accessed 2023, December 5].
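The inverse-kinematics inference mentioned in the body-tracking note above can be illustrated with a minimal sketch. This is a generic analytic two-bone solver based on the law of cosines, not VRChat’s or SteamVR’s actual implementation; the function name and the bone lengths are illustrative assumptions:

```python
import math

def two_bone_ik(l1, l2, target_dist):
    """Analytic two-bone IK via the law of cosines.

    Given bone lengths l1 (e.g. upper arm) and l2 (forearm) and the
    distance from the root joint (shoulder) to the tracked end point
    (the hand), return the interior bend angle of the middle joint
    (the elbow) in radians. The distance is clamped to the reachable
    range, so an out-of-reach target still yields a plausible
    (straight) pose rather than an error.
    """
    d = max(abs(l1 - l2), min(l1 + l2, target_dist))
    cos_bend = (l1 * l1 + l2 * l2 - d * d) / (2 * l1 * l2)
    return math.acos(max(-1.0, min(1.0, cos_bend)))

# A hand tracked at full arm's length forces a straight elbow (~pi rad).
print(round(two_bone_ik(0.30, 0.25, 0.55), 4))  # 3.1416
```

Full-body tracking replaces such inferred joint angles with measured ones, which is why adding trackers at the hips, knees or feet noticeably improves avatar plausibility.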

Police handling of hate crime: A pilot project to use VR technology for professional development in sensitizing police officers to the experiences of victims of bias crime in Hamburg

Prof. Dr. Eva Groß 

Professor of Criminology and Sociology at the Hochschule der Akademie der Polizei Hamburg. Her research focuses on group-focused misanthropy, bias crime (hate crime), victimization and unreported crime, (online) radicalization, the police, perceptions of crime, economization of the social sphere, and institutional anomie.

Prof. Dr. Ulrike Zähringer

Professor of Criminology and Criminal Law at the Hochschule der Akademie der Polizei Hamburg. Her research focuses on the emergence of prejudice, (serious) violent crime, and violence against children.

Dr. Anabel Taefi 

Sociologist and research assistant at the Hochschule der Akademie der Polizei Hamburg. Her research interests include developmental criminology, (serious) juvenile crime, violent crime, and the emergence of prejudice.

Immersive technologies, such as virtual reality-based training, offer new opportunities for experience-based learning. Learners can immerse themselves in virtual or fictional worlds, making the experience feel real and allowing them to fully participate in it. These technologies also have great potential for basic and advanced police training in various subject areas, particularly in addressing bias crime, which is the focus here. The quality of police interactions with minority communities can greatly impact the level of trust these communities have in both the police and the state. Trust in institutions is an important measure of the quality of a democratic society, as it is intertwined with diversity and participation. This article demonstrates the potential benefits of immersive technologies in helping the police handle victims of bias-motivated crimes, which often target members of minority communities. Improving the professionalism of the police in this manner is a crucial step in building trust within minority communities and strengthening democratic resilience. The article then discusses a pilot project carried out at the Hochschule der Akademie der Polizei Hamburg using virtual reality technology, along with the initial evaluation results.

1 Bias crime and how the police handle it in Germany[1]

Bias-based acts[2] and hate crimes are major issues when it comes to promoting diversity and inclusivity in many European countries. These acts specifically target individuals based on their social group membership and on characteristics protected under constitutional law, such as skin color, religious beliefs, or sexual orientation (Groß & Häfele, 2021). The concept of bias crime (BC) is closely linked to the concept of group-focused enmity (Gruppenbezogene Menschenfeindlichkeit, GMF) (Heitmeyer, 2002); both revolve around the prejudiced belief in the inequality of different population groups (ideology of inequality) (Heitmeyer, 2002; Zick, Küpper & Heitmeyer, 2009; Zick, Wolf, Küpper, Davidov, Schmidt & Heitmeyer, 2008). This inequality can manifest in discriminatory attacks and hostile acts, such as racism, anti-Semitism, or transphobia. The concept of bias crime therefore covers, theoretically, the territory where prejudiced attitudes turn into concrete actions (Zick & Küpper, 2021). Although GMF and bias crime share a similar foundation, they are not identical: GMF focuses on attitudes, while bias crime deals with actions. In her study, Krieg (2022) was able to empirically demonstrate the theoretical connection between GMF attitudes and the commission of discriminatory behavior in a representative sample of students (N = 2,824). What makes bias-motivated crime especially concerning is that, according to the logic of GMF, victims are targeted not just as individuals but also because of their group membership, either actual or perceived. 
These crimes not only harm the direct victim but also send a threatening message to the entire group to which the victim belongs (Groß & Häfele, 2021; Williams, 2021), making them damaging on a wider scale and threatening to democracy.

Since 2001, the police have officially recorded these crimes as “hate crimes” (Lang, 2014, p. 54). This includes crimes directed against a person or group of people due to their political stance, attitude and/or commitment, nationality, ethnicity, skin color, religious affiliation, ideology, or social status, physical and/or mental disability and/or impairment, gender/sexual identity, sexual orientation, or physical appearance (BKA, 2023). These acts can be committed directly against a person or group of persons, an institution, or an object/thing associated with one of the aforementioned social groups (actual or perceived affiliation), or against any target in connection with the perpetrator’s biases (BKA, 2023b). In Germany, as in other European countries, there was a significant increase in officially reported hate crimes between 2014 and 2018 (Riaz et al., 2021). For the year 2022, 11,520 offenses were recorded in Germany, representing a 10% increase over the previous year (BMI & BKA, 2023, p. 10). The few available studies on the unreported cases of bias crime in Germany indicate a very high number of unreported cases, ranging from 50% to 90% (e.g. Church & Coester, 2021; Fröhlich, 2021[3]; Groß, Dreißigacker & Riesner, 2019). 

Just like the GMF concept, the BC concept and its recording in Germany are continuously evolving to reflect social debates and developments. Since 2017, the revised police recording system has included the category “gender/sexual identity, sexual orientation” instead of the more restrictive “sexual orientation”, explicitly including trans* individuals in police counts. In addition, the category of “race” was removed from the police recording system, and “physical and/or mental impairment” was added in 2017 (Groß & Häfele, 2021). Since 2017, law enforcement authorities have also been required to consider the perspective of the victim, among other aspects, when assessing the circumstances of a crime (Kleffner, 2018, p. 35).

Although the German police counting system is theoretically well positioned, there are indications of quality gaps in the recording and perception of bias-motivated crime in police practice (e.g. Kleffner, 2018, p. 36; Habermann & Singelnstein, 2018; Groß & Häfele, 2021). There are significant differences between the statistics recorded by the police and those reported by independent counseling centers, suggesting an underestimation of bias crime based on police counts (e.g. Lang, 2015). For instance, specialized counseling centers registered approximately one third more acts of right-wing violence in 2017 than law enforcement agencies and offices for the protection of the constitution. When considering homicides linked to right-wing and racist violence as subcategories of BC, the discrepancy between civil society and security authority counts is particularly significant, as noted in a comprehensive study of right-wing political homicides in Berlin from 1990 to 2008 (Feldmann et al., 2018). There could be various reasons for this high discrepancy. Errors in the initial recording by the police can lead to distortions. In addition, some cases are reported to victim support services but not to the police. In a recent study[4], it was found that the most common reasons for not reporting bias-motivated crime were the fear that it would not make a difference, that the police would not be able to solve the case, and the concern about not being taken seriously. These frequently cited reasons suggest a lack of trust in the police among those affected. Empirical evidence also supports this assumption: victims of bias-motivated crime have been shown to have the least trust in the police compared to victims of non-bias-motivated crime and non-victims. This lack of trust is likely a key reason for the high estimated level of unreported crime in Europe.

Existing research in Germany confirms that many victims of bias crime feel that they are not taken seriously enough by the police or are treated inappropriately. This may be due to a lack of empathy, understanding, and proper training for police officers in dealing with bias crime. Unfortunately, this is not unique to Germany and is also seen in an international comparison. In 2018, the Greater Manchester Police (GMP) in England attempted to improve police officers’ sensitivity and professionalism towards those affected by implementing virtual reality training. This is where the VR headset-based training, aptly named “Affinity”, comes into play.

2 RESEARCH OBJECTIVE AND METHODS OF THE PILOT PROJECT: SENSITIZE POLICE OFFICERS TO THE EXPERIENCES OF VICTIMS OF BIAS CRIME

This virtual reality-based training, developed in cooperation with Mother Mountain Productions CIC (MMP) and Greater Manchester Police (GMP), allows officers to experience bias crime in a virtual setting by putting them in the role of victims in an immersive[5] environment. Real-life stories of individuals who have faced anti-Semitism, trans-hostility, and attacks as people with disabilities are recreated using 3D technology and experienced through VR goggles. The goal is to help officers understand how their initial reactions and language can affect victims and to recognize stereotypes and justifications used by perpetrators in these types of crimes. The purpose of the training is to establish a connection and empathy with victims, in order to build trust in the police and increase reporting rates. Data collected so far[6] has shown that VR training has a significant and long-term effect on attitudes and behavior among police officers, and also reinforces these changes among those who have already been sensitized.

As part of the “Immersive Democracy” project[7], a quasi-experimental VR-based Affinity training was conducted on August 21 and 22, 2023, at the Hochschule der Akademie der Polizei Hamburg in the Department of Sociology for future police officers.[8] The VR experience included original scenarios developed in collaboration with the GMP. The students were shown three English-language films in which they experienced discrimination from the perspective of victims from different marginalized groups: anti-Semitism, transphobia, and an assault on a visually impaired person. The participants also experienced being questioned by the police after the incident from the victim’s perspective. The films were accompanied by re-enacted interviews with the actual victims describing their experiences during the assault and with the police.

The response patterns of the participating police students were measured through scenario-based questions designed to assess empathy before and after using the VR goggles as part of the Affinity project.[9] The results of the quasi-experimental[10] study on the changes in the participants’ attitudes towards victims of bias-motivated crimes are presented in the following section.

3 RESULTS

Twenty-five third-semester students from the police protection program, who were not randomly selected, took part in the pilot project. Of these, 21 completed the closed questionnaire before and after the VR application. The quantitative analysis is based on these 21 cases and is not representative of the entire population of police students at the Hochschule der Akademie der Polizei Hamburg. Moreover, no control group was included, and the lack of randomization allows only for a weak quasi-experimental design. Despite these methodological limitations, the comparison of response patterns in the dependent variables before and after the VR application provides initial insights into changes in the police students’ attitudes towards victims of bias-motivated crimes and the prevalence of such crimes.
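The kind of before-and-after comparison reported below can be sketched as a simple tabulation of per-category shifts in the response pattern plus the mean paired change. The Likert values here are invented for illustration and are not the study’s data:

```python
from collections import Counter

# Hypothetical 5-point Likert responses (1 = strongly disagree ...
# 5 = strongly agree) from the same 21 respondents before and after
# a VR session. These values are invented for illustration only.
before = [3, 4, 4, 2, 5, 3, 4, 2, 3, 4, 5, 3, 4, 4, 3, 2, 4, 3, 4, 5, 3]
after  = [4, 5, 4, 3, 5, 4, 5, 3, 4, 4, 5, 4, 5, 4, 4, 3, 5, 4, 4, 5, 4]

def category_shift(pre, post):
    """Difference in response counts per category (post minus pre)."""
    pre_n, post_n = Counter(pre), Counter(post)
    return {cat: post_n[cat] - pre_n[cat] for cat in range(1, 6)}

shift = category_shift(before, after)          # how each bar in a
mean_change = sum(b - a for a, b in zip(before, after)) / len(before)
print(shift)                 # {1: 0, 2: -3, 3: -4, 4: 3, 5: 4}
print(round(mean_change, 2)) # 0.67
```

Because the same respondents answer both times, paired measures like this (rather than comparing two independent groups) are what a weak quasi-experimental design without a control group can support.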

Figure 1 shows the difference in the response patterns of the 21 students before and after the application to the statement: “Hate crime should be a priority in police work.”


Figure 1: The difference in response patterns of the 21 students before and after the VR application to the statement: “Hate crime should be a priority in police work.”

Among the 21 cases, strong agreement on the importance of prioritizing hate crimes in police work was clearly visible, with three cases showing an increase in agreement and the other two categories showing a decrease. This suggests that the students became more attuned to the issue of hate crimes after the VR application.

Agreement with the statement “Victims of hate crime should be more resilient and able to deal with the situation without reporting it to the police” decreased overall among the students after the VR application, while rejection of the statement increased (see Figure 2), indicating a higher sensitivity towards victims of bias-motivated crime.


Figure 2: Difference in the response pattern of 21 students before and after the VR application to the following statement: “Victims of hate crime should be more resilient and able to deal with the situation without reporting it to the police.”

The following statement gained approval after the VR application: “I believe that my way of interacting with a victim of a bias crime can influence that person’s ability to deal with what has happened.” (see Figure 3).



Figure 3: Difference in the response pattern of 21 students before and after the VR application to the following statement: “I believe that my way of interacting with a victim of a bias-motivated crime can influence that person’s ability to deal with what has happened.”

As intended, this result indicates an increased sensitivity towards one’s own behavior as a police officer as a result of the VR application, and therefore, an increased reflection on one’s own behavior in the investigated situations. It can be assumed that the VR application leads to an increased sense of self-efficacy as a police officer when dealing with the sensitivities of victims of bias-motivated crime.

The change in the response behavior of the 21 students to the following statement points in a very similar direction: “The way I deal with a victim has the potential to improve the victim’s experience.” (see Figure 4).


Figure 4: Difference in the response pattern of 21 students before and after the VR application to the following statement: “The way I deal with a victim has the potential to improve the victim’s experience.”

Overall, the collected quantitative data indicate an effect of the VR application in the intended direction for the Hamburg students, similar to what MMP and GMP found in their three-year study of police officers from Manchester (MMP & GMP, 2023).

In a qualitative design, the students were asked to reflect on their experience with the Affinity application and to share their thoughts and feelings. Many participants had a positive experience, as shown in the following examples:

  • “It felt like I was there myself.”
  • “I was directly engaged.”
  • “I paid more attention compared to a movie, where I usually look away and get distracted.”
  • “This made it easier to understand how people feel.”
  • “This method is a good way to gather people’s points of view, which is important when they come into contact with the police. I immediately felt a tension that I’m sure those affected also feel, as they know it could happen again.”

4 SUMMARY AND OUTLOOK

Hate crimes and bias-based crimes do not only affect individual victims, but also have a broader impact on corresponding social subgroups and the quality of democracy as a whole. Studies show that victims of hate crimes often do not report them to the police due to low trust. It must be in the interest of the police to undergo appropriate training and education to ensure that vulnerable victims, such as those affected by hate crimes, feel comfortable turning to the police for support. This could also help address the issue of unreported cases in this area.

To address this, the virtual reality-based training program “Affinity” was developed by MMP in collaboration with Greater Manchester Police. This training allows police officers, including trainees, to experience real-life stories of bias crime in an immersive environment. They can take on the perspective of the victims and witness the effects of verbal and non-verbal communication through the reactions of the interviewing officers. Additionally, they learn about stereotypical justifications associated with these acts, which helps them better identify and classify them. The training also aims to improve empathy among police officers. Initial data from England shows that this training has a high and lasting effectiveness on changes in attitude and behavior among police officers and reinforces the sensitivity of those already aware of the issues associated with bias crime.

Similarly, results from a pilot project at the Hochschule der Akademie der Polizei Hamburg with 25 police students also show the effectiveness of this training. The students strongly agreed that hate crime should be a priority for police work; their agreement with the expectation that victims should simply be more resilient decreased, while their awareness of their own responsibility as police officers in dealing with hate crime increased. After the VR experience, they were also more likely to believe that their interactions with victims of bias crime can influence how victims handle the crime and how they perceive and evaluate it in general. Thus, overall, a positive effect was also evident among police students in Hamburg – although the group of individuals tested was very small due to the pilot nature of the project, and the results therefore cannot be generalized.

It should be noted that a simulated VR experience cannot truly replicate the perspective of those affected by bias crime and racism. However, other methods used in police training, such as role-playing or training with colleagues, have similar limitations, if not greater ones. The disconnection from the environment in the VR experience, produced by the combination of VR goggles and headphones, leads participants to experience the simulation more intensely than other training formats. This was reflected in the comments of the students immediately after the VR experience.

Overall, this suggests that VR training in immersive environments holds promise in sensitizing police officers to dealing with victims of hate crime. The increased sensitivity of police officers may also lead to greater trust in the police and a higher likelihood of crime reporting by victims. In Germany, where different training concepts are used for prospective police officers compared to England, further research is necessary to assess the effectiveness of VR-based training with larger groups, using control group designs and longitudinal approaches. Additionally, scenarios designed specifically for Germany, including language, clothing, and environment, should be developed to make the experience as realistic as possible.

BKA (2023, 9. Mai). Politisch motivierte Kriminalität 2022 – Vorstellung der Fallzahlen in gemeinsamer Pressekonferenz. Bundeskriminalamt. https://www.bka.de/SharedDocs/Kurzmeldungen/DE/Kurzmeldungen/230509_PMK_PK.html (Zugegriffen: 08.08.2023).

BKA (2023b). Politisch motivierte Kriminalität (PMK) -rechts- Phänomen – Definition, Beschreibung, Deliktsbereiche. Bundeskriminalamt. https://www.bka.de/DE/UnsereAufgaben/Deliktsbereiche/PMK/PMKrechts/PMKrechts_node.html (Zugegriffen: 08.08.2023).

Church, D., & Coester, M. (2021). Opfer von Vorurteilskriminalität. Thematische Auswertung des Deutschen Viktimisierungssurvey 2017. Forschungsbericht. Kriminalistisches Institut, Kriminalistisch-kriminologischen Forschung, BKA.

Feldmann, D., Kohlstruck, M., Laube, M., Schultz, G., & Tausendteufel, H. (2018). Klassifikation politisch rechter Tötungsdelikte – Berlin 1990 bis 2008. Universitätsverlag der TU Berlin.

Fröhlich, W. (2021). Hasskriminalität in München. Vorurteilskriminalität und ihre individuellen und kollektiven Folgen. Kurzfassung der Studie des Sozialwissenschaftlichen Instituts München. https://stadt.muenchen.de/dam/jcr:c19e83da-eca8-48b0-920e-e6e37791d4e7/Kurzfassung_DRUCK_final.pdf (Zugegriffen: 25.10.2023).

Fuchs, W. (2021). Vorurteilskriminalität – Konzept, Auswirkungen auf Opfer, Rechtsgrundlagen und verbesserte statistische Erfassung. Journal für Strafrecht, 8(3), 279-294. https://doi.org/10.33196/jst202103027901

Groß, E., & Häfele. J. (2021). Vorurteilskriminalität. Konzept, Befunde und Probleme der polizeilichen Erfassung. Forum Politische Bildung und Polizei, 1/2021, 20-30.

Groß, E., Dreißigacker, A., & Riesner, L. (2019). Viktimisierung durch Hasskriminalität. Eine erste repräsentative Erfassung des Dunkelfeldes in Niedersachsen und in Schleswig-Holstein. Wissen schafft Demokratie, 4, 140-159. https://doi.org/10.19222/201804/13

Habermann, J., & Singelnstein, T. (2018). Praxis und Probleme bei der Erfassung politisch rechtsmotivierter Kriminalität durch die Polizei. Wissen schafft Demokratie, 4, 20-31. https://doi.org/10.19222/201804/02

Heitmeyer, W. (2002). Gruppenbezogene Menschenfeindlichkeit. Die theoretische Konzeption und erste empirische Ergebnisse. In Heitmeyer, W. (Hrsg.), Edition Suhrkamp: Vol. 2290. Deutsche Zustände. Folge 1 (S. 301). Suhrkamp Verlag.

Kleffner, H. (2018). Die Reform der PMK-Definition und die anhaltenden Erfassungslücken zum Ausmaß rechter Gewalt. Wissen schafft Demokratie, 4, 30-37. https://doi.org/10.19222/201804/03

Krieg, Y. (2022). The Role of the Social Environment in the Relationship Between Group-Focused Enmity Towards Social Minorities and Politically Motivated Crime. Kölner Zeitschrift für Soziologie und Sozialpsychologie, 74, 65-94. https://doi.org/10.1007/s11577-022-00818-7

Lang, K. (2014). Vorurteilskriminalität. Eine Untersuchung vorurteilsmotivierter Taten im Strafrecht und deren Verfolgung durch die Polizei, Staatsanwaltschaft und Gerichte. Nomos-Verlag.

MMP (Mother Mountain Productions CIC) & GMP (Greater Manchester Police) (2023). Affinity. VR Hate Crime Training Program 2020-2023 [Impact review presentation]. Workshop in Hamburg 2023.

Riaz, S., Bischof, D., & Wagner, M. (2021). Out-group Threat and Xenophobic Hate Crimes: Evidence of Local Intergroup Conflict Dynamics between Immigrants and Natives. OSF Preprints. https://doi.org/10.31219/osf.io/2qusg

Spreng, R., McKinnon, M., Mar, R., & Levine, B. (2009). The Toronto Empathy Questionnaire: scale development and initial validation of a factor-analytic solution to multiple empathy measures. Journal of Personality Assessment, 91(1), 62-71. https://doi.org/10.1080/00223890802484381

Williams, M. (2021). The Science of Hate: How prejudice becomes hate and what we can do to stop it. Faber & Faber.

Zick, A., & Küpper, B. (2021). Die geforderte Mitte. Rechtsextreme und demokratiegefährdende Einstellungen in Deutschland 2020/2. Friedrich-Ebert-Stiftung von Franziska Schröter. Dietz.

Zick, A., Küpper, B., & Heitmeyer, W. (2009). Prejudices and group-focused enmity – a socio-functional perspective. In A. Pelinka, K. Bischof, & K. Stögner (Hrsg.), Handbook of Prejudice (S. 273-303). Cambria Press.

Zick, A., Wolf, C., Küpper, B., Davidov, E., Schmidt, P., & Heitmeyer, W. (2008). The Syndrome of Group-Focused Enmity: The Interrelation of Prejudices Tested with Multiple Cross-Sectional and Panel Data. Journal of Social Issues, 64(2), 363-383. https://doi.org/10.1111/j.1540-4560.2008.00566.x


[1] This section builds on an earlier publication by the first author (Groß & Häfele, 2021) and contains text fragments that have already been published there.

[2] From a criminological perspective, the term bias crime is more appropriate than hate crime, especially as the acts are an expression of group-related devaluation and discrimination (group-focused misanthropy) or negative prejudices against social groups that are linked to social structures of power and oppression, see also Fuchs, 2021, p. 270. 

[3] https://stadt.muenchen.de/dam/jcr:c19e83da-eca8-48b0-920e-e6e37791d4e7/Kurzfassung_DRUCK_final.pdf

[4] Groß, E., Häfele, J. & Peter S. (i.E.). Kernbefundebericht zum Projekt HateTown.

[5] “Immersive” means “to dive into” or “to immerse oneself in something” and refers to the effect that occurs in the consumer when they “dive into” the virtual or fictional world and it feels as if the experience is real and they are actually part of it.

[6] MMP & GMP, 2023.

[7] The Project Immersive Democracy is part of the European Metaverse Research Network with members in the Netherlands, Poland, Spain, Italy, Sweden and France. The aim is to conduct research that demonstrates both the liberalizing and highly beneficial opportunities of the metaverse idea as well as the potential risks and challenges associated with it. Further information on the project can be found at https://metaverse-forschung.de/en/about-the-project/ [17.10.2023], the European Metaverse Research Network is presented at https://www.metaverse-research-network.info [17.10.2023].

[8] This was followed by a one-day expert workshop with representatives from the police, academia, victim protection organizations, police students and teachers.

[9] Specifically, the “Toronto Empathy Questionnaire (TEQ)”, a measurement method developed by Spreng et al. (2009), was used in the scenario-related development of the question formulation. It is a one-dimensional, concise and valid instrument for assessing empathy. As empathy itself counts as a comparatively stable characteristic, related behavioral and scenario-related questions were formulated that are suitable for a before-and-after measurement.

[10] In addition to the lack of randomization of the experimental group, no parallel control group could be investigated; the experimental element in the setting is therefore exclusively the treatment and the before-after measurement of the dependent variables. 
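The before-and-after measurement described in notes [9] and [10] amounts to a paired comparison on the same participants. A minimal sketch of such an analysis might look as follows; the scores are invented for illustration and are not data from the study:

```python
from statistics import mean, stdev
from math import sqrt

def paired_t(before, after):
    """Paired t-statistic for a before-after (pre/post) measurement
    on the same participants, as in the treatment design of note [10]."""
    diffs = [a - b for b, a in zip(before, after)]
    # t = mean difference divided by its standard error
    return mean(diffs) / (stdev(diffs) / sqrt(len(diffs)))

# Invented empathy-related scores for six participants, pre and post treatment.
pre  = [3.2, 4.1, 3.8, 2.9, 3.5, 4.0]
post = [3.9, 4.3, 4.2, 3.4, 3.6, 4.4]
t = paired_t(pre, post)
```

Without a parallel control group, such a statistic can only describe the pre-post change, not attribute it causally to the treatment, which is exactly the limitation note [10] points out.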


Virtual realities, real participation. Challenges and opportunities of public participation in the metaverse

1 INTRODUCTION: THE METAVERSE HYPE IS OVER – NOW WHAT?

In 2021, Meta Platforms presented its vision of the metaverse and promptly sparked a polarised debate about the future of digital environments (Dolata & Schwabe, 2023). The decision to change the company’s name from Facebook to Meta was also announced during this presentation, thus making it clear to all those present and other interested parties: Facebook and its founder were serious about virtual environments. Perhaps this was also what the company intended: namely, to interrupt the long-running debate about Instagram’s platform mechanisms and their detrimental effect on health, and the debate around Facebook’s role in amplifying calls to violence in India, some with fatal outcomes. In this complex and very tense situation for the company, the decision was taken to rebrand, thereby signalling to the tech industry that the future of the Internet lay in the metaverse. The surprise succeeded, as journalists and technology enthusiasts proceeded to debate Web3 and the potential of immersive systems for months afterwards.

Meta did not invent the term or the immersive environment. The term metaverse was first used by author Neal Stephenson in his 1992 science fiction novel “Snow Crash” (Stephenson, 1992), in which the vision of an enhanced and virtual environment gradually becomes reality. The metaverse is described as a hypothetical, immersive and interactive virtual space that is intended to represent a next generation of the Internet (Dwivedi et al., 2022; Xi et al., 2023). Currently, a number of virtual worlds exist, provided and organised by various platform operators under the collective term metaverse. What these environments have in common is that – regardless of whether they are used in a gaming context or an e-commerce context – they consist of a variety of virtual spaces, objects and entities and can be accessed by users via a range of different devices. The metaverse is thus an extension of the concept of immersive systems. It does not describe an individual technology, but rather the goal of creating a seamless, interconnected digital world that blurs the boundaries between reality and virtuality and enables new forms of social interaction, trade and entertainment (Dwivedi et al., 2022). The metaverse is thus described, but not the technologies of which it is composed. XR is used as an umbrella term covering all immersive technologies that aim to expand the human perception of reality. XR technologies include AR, VR and everything in between (Xi et al., 2023). Other authors criticise the fuzzy definition of XR and suggest XR should not be understood as an abbreviation of extended reality but of xReality, in which X represents any digital reality (Rauschnabel et al., 2022). Dwivedi et al. (2022) argue that, in the context of the metaverse, immersion can be achieved either through AR or through VR, but not both at the same time.
While simultaneous use is still unresolved, the AR and VR hardware market is changing rapidly and enabling more and more mixed formats. Various devices were used in the past and are still used today for AR and VR, such as head-mounted displays, smartphones and tablets, and so it is likely that the future of XR hardware will allow both realities to intertwine. Hardware like Apple Vision Pro blurs boundaries between the physical and digital worlds by seamlessly connecting real and virtual environments. Meta did therefore succeed in introducing the term metaverse into the mainstream market; however, the process of developing XR technologies and platform infrastructure is proving more difficult than expected, meaning that the first wave of hype around the metaverse appears to have already died down (Robinson, 2023). Accordingly, we find ourselves in a situation in which it is foreseeable that the metaverse will be a future of digital environments, but the path to get there will be rockier than many expected. This period of development should be used as an opportunity to think about what the metaverse can and should be used for. 

The discussion around the metaverse in 2021 is reminiscent of debates about social networks like Facebook and Twitter at the beginning of the 2010s. Their use also unleashed great expectations and concerns, and deliberative powers were also attributed to them, particularly in the course of the Arab Spring (Wolfsfeld et al., 2013). Over time, social media platforms have turned out to be not only deliberative discourse platforms but services where problematic aspects of social discourse – such as hate, exclusion and disinformation – are also found, and the platform design of social networks plays no small part in this. Their mechanisms are based on principles of the platform economy, in which there are customers who should not be confused with users. Their customers are advertisers, for whom the data-driven platforms offer new opportunities for personalised advertising and sales. And thus the expectations of the metaverse have also been strongly driven by economic interests up to now. Research into immersive systems also seems to focus largely on e-commerce issues. However, to ensure that similar mistakes are not made when designing this new environment, it is important to highlight opportunities for civil society and political use in order to set out early on how democratic entities, states and also smaller units like cities and local communities could engage in the metaverse. One possible vision of use is for immersive systems to be used for citizen participation, e.g. in urban planning. Demand for digital participation offerings has been increasing in recent years, in part due to the coronavirus pandemic  (United Nations, 2020). How these offerings could be combined with immersive systems and thus used in a metaverse context will be discussed below.

Back in the 1990s, Lombard and Ditton (1997) found that telepresence and immersion can lead to a greater sense of involvement and engagement among users. This finding is, to some extent, groundbreaking for dealing with XR in public participation. A technology that succeeded in inspiring people’s enthusiasm in the first scientific experiments and manages to do so in the context of video games in particular seems highly relevant for public participation. 

In Reconstructing Democracy (2020), Taylor, Nanz and Taylor emphasise the importance of local participation in giving citizens opportunities to make their concerns heard, including in representative democracies. In times of major transformation in particular, the authors see the necessity of including citizens in change processes (e.g. in the energy and mobility sectors). The village of Langenegg in Austria is cited as a positive example. In this rural area, strongly affected by general population decline, the public authority succeeded in involving citizens in creating a vision for the future of the region. This long-running consultation and cooperation process helped to keep the village attractive for its residents and, contrary to forecasts, even positively affected the general population development (Statistik Austria, 2021; Taylor et al., 2020). Langenegg is mentioned as an anecdotal example of the positive impact of participation projects. In contrast to this are the bigger debates about citizen participation in Germany. These often relate to problem cases where participation projects have been initiated in order to resolve conflict situations caused by democratic deficits. One such example is the debate around the rebuilding of Stuttgart’s main railway station as part of the Stuttgart 21 infrastructure project. Remodelling of the station began in 2010 when the building was partially demolished, which led to numerous protests. The conflict could only be resolved through an arbitration process and a referendum. As a result of this case, the general public across the nation became very aware of the implications and cost dimension of a lack of early involvement in construction planning (Brettschneider, 2013). The clients’ failure to communicate properly is cited as a reason that triggered this conflict.
For example, construction plans were not made available locally for people to view, which meant the implementation plans caused confusion once construction began (Thaa, 2013). Lack of inclusion and non-transparent communication like this can damage confidence in politicians and local authorities. The conflict in Stuttgart was de-escalated through a public referendum; nevertheless, this avoidable conflict has had a long-lasting effect on construction planning.

Beyond this prominent case of problematic non-participative construction planning, a recurring criticism of participation processes relates to their timing, organisational and financial aspects. Participation processes are viewed as costly, lengthy and, from the initiators’ perspective, complex and difficult to control. However, the obvious knowledge gap between the initiators of building projects and the citizens affected by these projects makes it necessary to look for new, easy-to-implement approaches here. Since the 2010s, providers like Consul, Liquid Democracy, LiquidFeedback, CitizenLab and Zebralog have established themselves in the field of digital public participation, also referred to as e-participation (Macintosh, 2004) and digital citizen participation (Fegert, 2022). The providers have developed modular systems, some of which are open source, and aim to enable participation procedures and in some cases also voting procedures using digital web platforms. Compared to traditional outreach and/or postal participation methods, these have a relatively low implementation threshold. Digital participation projects are therefore praised for their relative cost efficiency (Spirakis et al., 2010). Nevertheless, these platforms had a relatively niche existence in the public sphere until the Pirate Party started using LiquidFeedback to organise itself. The Covid-19 pandemic also prompted a rethink here: when German parties were suddenly forced to use appropriate platforms to enable the political participation of their members and delegates at party conferences, the practical benefits and efficiency of these were brought to the attention of a wider public. Rottinghaus and Escher (2020) and Novo Vázquez and Vicente (2019) show, however, that general political interest and individual concern about the subject matter remain the crucial motivating factors for public participation using digital resources.
In addition, other researchers give evidence of gender-specific differences in the use of e-participation offerings (Kim & Lee, 2019). There are thus also certain groups and milieus online that participate primarily digitally. Usability issues can be cited as a reason for this. Here, the current tools do not appear to be state of the art as regards user friendliness (Fegert et al., 2021). While big tech platforms for e-commerce or social media continually test and refine their usability in the background in order to avoid usage problems that might scare off users, e-participation falls short of its potential in this regard. For this reason too, it is important to look for new approaches to digital participation that boost people’s willingness and motivation to participate. Think tank Democracy Technologies predicts that market volume in the field of digital public participation will grow from €100 million in 2022 to €300 million in the next five years (Democracy Technologies, 2023). Cities and local authorities thus appear to be slowly getting ready for online participation processes, and so the task now is to design these according to user interests.

To enable public participation in local decision-making processes, it is necessary to create easy-to-understand visualisations as well as reliable and user-friendly feedback mechanisms. Here, the use of immersive systems has scope to effectively expand participation platforms. Even if none of the participation platforms currently use immersive systems on their platforms, their use has been researched for a long time. This state of research is not comparable with the resources channelled into researching immersive systems for industrial and commercial contexts, but some of the research results from these other contexts can be transferred to participation with immersive systems. For example, Suh and Lee (2005) were able to show quite early on that VR use left users with a higher level of knowledge about products. My own studies for the participation context have confirmed this. Funded by the Federal Ministry of Education and Research (BMBF), the FZI Research Center for Information Technology has created the House of Participation and is currently carrying out a second research project on the use of immersive systems in digital citizen participation. Here, immersive participation applications have been and are currently being developed and researched in the projects Take Part (2018–2021) and VIRTUS (2021–2024), also with the aim of critically evaluating the marketing potential of such use. In line with participative technology development, standards for such platforms are developed and defined together with citizens and building planners. The prototypes were then evaluated and refined in various studies with different methodological approaches, one of the aims being to give e-participation platform operators, cities, local authorities and building planners suggestions for the use of immersive participation. Examples of results are shown below: 

In a qualitative interview study (n=27) carried out in 2018 as part of a specific construction planning process in a major German city, we observed a high level of interest among very different stakeholders in the idea of using immersive systems for citizen participation (Fegert et al., 2020). Overall, our study showed strong appreciation for the idea of using digital technologies for public participation: a large majority of the study participants found that digital technologies were a valuable complement to public participation and expected that digital technologies would promote access to information about public building projects. Accordingly, two thirds of the participants preferred 3D visualisation of building projects to traditional architectural plans. We concluded that the future design of the application should support this process by harnessing the potential of VR- and AR-based visualisations in order to include citizens in the planning of public building projects and minimise the gap in knowledge between them and experts (Fegert et al., 2020).

In a field study carried out in 2019 (n = 339) we showed that immersive systems, particularly VR, significantly help to stimulate the imagination of individuals with regard to building sites and urban planning and can thus provide important support for digital participation processes. Immersive systems can concretise and improve spatial understanding in e-participation; however, significant differences between AR and VR become apparent here, with VR performing significantly better than AR (see Figure 1, Fegert, 2022).
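A between-group contrast of 7-point Likert ratings, such as the AR-versus-VR comparison above, can be sketched as a two-sample test; the ratings below are invented for illustration and are not the field study’s data:

```python
from statistics import mean, stdev
from math import sqrt

def welch_t(x, y):
    """Welch's t-statistic for two independent groups with
    possibly unequal variances (e.g. VR vs. AR rating groups)."""
    vx, vy = stdev(x) ** 2, stdev(y) ** 2
    return (mean(x) - mean(y)) / sqrt(vx / len(x) + vy / len(y))

vr = [6, 7, 5, 6, 7, 6, 5, 7]   # invented VR informativeness ratings (1-7)
ar = [4, 5, 4, 6, 5, 4, 5, 4]   # invented AR informativeness ratings (1-7)
t = welch_t(vr, ar)
```

A larger positive statistic here would correspond to the pattern reported above, with VR rated as more informative than AR.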


Figure 1: Information content of visualisation measured with a 7-point Likert scale, in which 7 indicates strong 

We showed that immersive components allow users to better assess the proportions of a building project. This result was verified in our study by an estimation question: here, immersion helped people correctly classify proportions, thus demonstrating that the use of immersive systems in participative construction planning actually enhances spatial understanding (Fegert et al., 2020). AR and VR also have the ability to lower the participation threshold by motivating citizens to take part. The quantitative field study also showed that immersive systems have a positive effect on willingness to engage with building projects and on willingness to contribute to these. We were thus able to demonstrate the relevance for other participation concepts, such as participatory budgeting. The qualitative and quantitative studies also made it possible to identify criteria for the design of an immersive participation platform. We asked ourselves how an immersive AR- and VR-based digital participation platform for urban and construction planning should be designed for the purpose of increasing citizens’ willingness to get involved. Based on the studies, design principles were formulated and refined. They are presented in brief here (Fegert, 2022): platform development for immersive e-participation in urban planning should give consideration to accessibility (1). An immersive participation environment should be easy to navigate and work on different devices. The design principle of information quality (2) emphasises the importance of hardware-appropriate forms of visualisation, which should play to the strengths of various technologies and immersion levels. For example, while VR tends to be perceived as an entertainment medium, AR is considered more effective for conveying information. Motivation (3) represents another design principle. When designing such environments, everything should be done to involve users at all stages. 
From updates on the progress of the participation project to incentives such as badges or other elements of gameful design, efforts should be made to continuously maintain the interest of users. In immersive environments too, it is very important to users that people interact respectfully when participating. Fear of hate speech is a factor for immersive environments as well, and this can prevent citizens from getting involved. As a design principle, transparency (4) aims to ensure something that poses a challenge in less text-intensive immersive environments: namely, to represent the process-related participation setting, among other things. For instance, this means relying much more on shorter text blocks or video content in immersive environments in order to communicate, for example, how binding a consultation or decision-making process is. The final design principle is data protection and data sovereignty (5). E-participation applications mostly adhere to the data minimisation principle, so when it comes to participation in immersive environments it is also true that users largely want to participate anonymously. This stands in contrast to the use of personalised avatars, as found in the metaverse. Above all, when selecting and configuring the hardware, close attention must be paid to ensuring that data that is often collected, such as eye tracking, is not recorded in a personalised manner.
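The data minimisation principle behind design principle (5) can be made concrete with a small sketch: store only coarse, non-identifying aggregates of a gaze trace rather than the raw samples. The function name and aggregation choice are our own illustrative assumptions, not any platform’s actual practice:

```python
from statistics import mean

def minimise(samples):
    """Reduce a raw gaze trace (list of (x, y) fixation coordinates)
    to coarse aggregates before any persistence, so that no personally
    identifying eye-tracking trace is ever stored."""
    xs = [x for x, _ in samples]
    ys = [y for _, y in samples]
    return {"n": len(samples), "mean_x": mean(xs), "mean_y": mean(ys)}

# Invented raw eye-tracking samples in normalised screen coordinates.
trace = [(0.42, 0.51), (0.44, 0.49), (0.40, 0.55)]
summary = minimise(trace)  # store only this summary, never the raw trace
```

The design choice is that discarding the raw trace at the earliest possible point, rather than anonymising it later, is what keeps personalised recording from happening in the first place.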

In addition to this long-term examination of immersive participation platforms in government funded research projects, a number of student projects very creatively demonstrate what can be implemented with simpler approaches. For example, in 2020 Paulina Porten developed Augmented Participation: an exciting and well designed participation tool that combines voice messages with immersive presentations (see Figure 2, Paulina Porten, 2020).

Figure 2: Augmented Participation application (Paulina Porten, 2020).

After this overview of participation using immersive systems, study results on the use of the metaverse for immersive participation will be presented below.

The announcements made by Meta Platforms in October 2021 led, as mentioned above, to increased public interest in immersive systems. Consequently, within the VIRTUS research project, it made sense to reassess the relevance of the combination of immersive systems and public participation in the context of the metaverse. Therefore, at the beginning of 2022, a further qualitative interview study (n=14) was carried out to assess the effects of Meta’s announcements on the relevance of this combination. In another major German city, stakeholders involved in public construction projects were again interviewed and specifically asked about their knowledge and concerns regarding the metaverse. Some of the results are reported below.

  • Most participants had little to no knowledge of the metaverse. Those who were aware of the metaverse had heard about it primarily as a result of media reports relating to Facebook’s name change. There was general scepticism about whether the metaverse was a longer-term phenomenon, with comparisons repeatedly made to the Second Life platform.
  • The study showed mixed opinions on the use of the metaverse for civic participation. While respondents saw the potential for e-commerce and gaming, they were more cautious regarding its use in e-participation – unlike with the much more concrete use of AR or VR. In particular, concern was mentioned about alienation from reality if political participation were to shift into immersive environments. This concern was not associated with the use of individual technologies. On the other hand, others saw promising potential in the modelling of cities in the metaverse as digital twins. In particular, respondents saw an opportunity here in being able to participate in a detailed reproduction of their urban environments, regardless of location.
  • The respondents expressed concerns about the democratic control of the metaverse. There was a specific concern that authoritarian states could undermine these virtual environments and exploit them for their own benefit. Respondents were also worried about the non-transparent design of environments by platform operators. The fear of manipulation by powerful individuals was repeatedly emphasised here. Many misgivings were voiced regarding the exclusivity of the metaverse in particular: on the one hand, in the sense of the digital divide, generational differences determine who can take part in participation settings in the metaverse, but above all it was feared that people with visual impairments or poor vision would be excluded from the metaverse itself and, accordingly, also from participation settings in this immersive environment. At the same time, the view was expressed that participation projects in the metaverse could be interesting for children and young people in particular and would encourage them to get involved.

In summary, it can be said that the respondents in the qualitative study responded cautiously to the idea of the metaverse as a place of participation. Unlike with the technologies behind it, AR and VR, the association with the platform operator seems to be more problematic when it comes to trust, which is the prerequisite for successful participation.

The presentation of the state of research has shown that use of immersive technologies in digital public participation is not yet a reality integrated into existing software. Their use can strongly motivate people to participate and inspire spatial imagination in participation processes.

Since participation processes are initiated at municipal or city level – and digitalisation of these administrative units is badly lagging behind, particularly in Germany – it is also not foreseeable that the metaverse will play an important role in public participation here in the next five years. Immersive technologies will certainly be used in various places, but it will probably not be the operators of e-participation platforms who will move to Web3 as early adopters. Financial and staff resources, particularly for development, are limited among these providers. By contrast, wealthy countries like Saudi Arabia are already focusing heavily on digital twins, which is why it is more conceivable that such countries will approach the subject of digital twins in the metaverse as prestige projects and will experiment with participation projects here. However, local circumstances mean that the participation space would not be situated in a democratic environment there. In the use of digital twins, it will also be interesting to see whether the real participation of people is desired here in the future, or whether their behaviour will increasingly be simulated. Here, generative AI in the form of generative agents with memory streams (Park et al., 2023) represents an alternative to real participation, in which participatory behaviour could be simulated. These possibilities should also be monitored in order to be able to take a position if they should become reality in the context of digital twins or the metaverse.
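To make the idea of simulated participatory behaviour more tangible, the following toy sketch mimics the “memory stream” of Park et al. (2023): an agent logs observations over time and retrieves the most relevant ones by keyword overlap and recency before acting. All names and the scoring rule are our own drastic simplifications, not the paper’s implementation:

```python
from dataclasses import dataclass, field

@dataclass
class Memory:
    step: int   # simulation time at which the observation was made
    text: str   # the observation itself

@dataclass
class Agent:
    name: str
    memories: list = field(default_factory=list)

    def observe(self, step, text):
        # Append an observation to the agent's time-ordered memory stream.
        self.memories.append(Memory(step, text))

    def retrieve(self, query, now, k=2):
        # Score each memory by keyword overlap with the query plus recency,
        # a crude stand-in for the paper's relevance/recency/importance scoring.
        def score(m):
            overlap = len(set(query.split()) & set(m.text.split()))
            recency = 1.0 / (1 + now - m.step)
            return overlap + recency
        return sorted(self.memories, key=score, reverse=True)[:k]

# A hypothetical simulated resident in a digital-twin participation scenario.
agent = Agent("resident_17")
agent.observe(1, "new bridge plan presented at town hall")
agent.observe(2, "neighbour worried about bridge traffic noise")
agent.observe(3, "weather was sunny")
relevant = agent.retrieve("opinion on bridge plan", now=4)
```

Even this toy version shows why such simulations warrant monitoring: the agent’s “opinion” is entirely determined by what is fed into its memory stream, not by any real citizen’s interest.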

Perhaps the question that should be specifically asked now is what the actual interesting applications for citizen participation in the metaverse are. Use for small, non-representative participation processes seems to be the most feasible. For example, it could be exciting to enable the inclusion of diaspora communities in the planning of memorial sites, irrespective of where they currently live. Another useful setting could be participative gatherings as part of citizens’ councils. As a general trend in participation, these randomly selected advisory boards have gained importance in recent years, including in federal politics. Personal interaction is in the foreground here, which could make the metaverse interesting, given its embodiment and implementation of real-time interactions in the form of avatars.

The following challenges still stand in the way of enabling citizen participation in the metaverse: high costs and the availability of hardware; the performance of hardware, which involves its own challenges with short battery life and strong light sensitivity; the interconnectivity and interoperability of platform and hardware standards. These practical points create an obstacle when it comes to designing inclusive participation processes in the metaverse. As mentioned above, scepticism towards platform operators is also an obstacle to usage. In addition, the desire for personal interaction and the use of avatars contrast with the need for anonymity in participation processes. Nevertheless, it is fascinating and important to consider how the metaverse could also be used for democratic procedures and how it might complement existing analogue formats. There is huge potential to clearly illustrate the subject matter of participation and organise gatherings from different places on the subject matter. This gives rise to a task for e-participation platform operators, virtual environments like Meta, and research: when designing the metaverse, consideration should be given from the outset to how the spaces can be designed in order to enable democratic participation. Mistakes made in the case of social media – where content moderation issues and thus also the level of hate speech always depend on the willingness of the platforms themselves to take action against these occurrences – could be completely rethought for the metaverse. It is to be hoped that places for democratic experiments will emerge and be researched in the metaverse and that, conversely, it does not come to be known for exclusion and hate. 

Brettschneider, F. (2013). Großprojekte zwischen Protest und Akzeptanz: Legitimation durch Kommunikation. In F. Brettschneider & W. Schuster (Eds.), Stuttgart 21: Ein Großprojekt zwischen Protest und Akzeptanz (pp. 319–328). Springer Fachmedien Wiesbaden. https://doi.org/10.1007/978-3-658-01380-6_12

Democracy Technologies. (2023). Report on Democracy Technologies in Europe. Democracy Technologies. https://democracy-technologies.org/report-2023/

Dolata, M., & Schwabe, G. (2023). What is the Metaverse and who seeks to define it? Mapping the site of social construction. Journal of Information Technology, 02683962231159927. https://doi.org/10.1177/02683962231159927

Dwivedi, Y. K., Hughes, L., Baabdullah, A. M., Ribeiro-Navarrete, S., Giannakis, M., Al-Debei, M. M., Dennehy, D., Metri, B., Buhalis, D., Cheung, C. M. K., Conboy, K., Doyle, R., Dubey, R., Dutot, V., Felix, R., Goyal, D. P., Gustafsson, A., Hinsch, C., Jebabli, I., … Wamba, S. F. (2022). Metaverse beyond the hype: Multidisciplinary perspectives on emerging challenges, opportunities, and agenda for research, practice and policy. International Journal of Information Management, 66, 102542. https://doi.org/10.1016/j.ijinfomgt.2022.102542

Fegert, J. (2022). Digital Citizen Participation: Involving Citizens Through Immersive Systems in Urban Planning. https://doi.org/10.5445/IR/1000147640

Fegert, J., Pfeiffer, J., Peukert, C., Golubyeva, A., & Weinhardt, C. (2020, December 14). Combining e-Participation with Augmented and Virtual Reality: Insights from a Design Science Research Project. ICIS 2020 Proceedings. https://aisel.aisnet.org/icis2020/digitization_in_cities/digitization_in_cities/4

Fegert, J., Stein, C., Peukert, C., & Weinhardt, C. (2021). Using e-Participation for Mission Statement Development: Promises and Challenges. Proceedings of the 19th International Conference “e-Society”, 2021. Ed.: P. Kommers, 233. https://publikationen.bibliothek.kit.edu/1000130538

Kim, S., & Lee, J. (2019). Gender and E-Participation in Local Governance: Citizen E-Participation Values and Social Ties. International Journal of Public Administration42(13), 1073–1083. https://doi.org/10.1080/01900692.2019.1575669

Lombard, M., & Ditton, T. (1997). At the Heart of It All: The Concept of Presence. Journal of Computer-Mediated Communication3(JCMC321). https://doi.org/10.1111/j.1083-6101.1997.tb00072.x

Macintosh, A. (2004). Characterizing E-Participation in Policy-Making. Proceedings of the Proceedings of the 37th Annual Hawaii International Conference on System Sciences (HICSS’04) – Track 5 – Volume 5, 50117.1-. http://dl.acm.org/citation.cfm?id=962753.962993

Novo Vázquez, A., & Vicente, M. R. (2019). Exploring the Determinants of e-Participation in Smart Cities. In M. P. Rodríguez Bolívar & L. Alcaide Muñoz (Eds.), E-Participation in Smart Cities: Technologies and Models of Governance for Citizen Engagement (pp. 157–178). Springer International Publishing. https://doi.org/10.1007/978-3-319-89474-4_8

Park, J. S., O’Brien, J. C., Cai, C. J., Morris, M. R., Liang, P., & Bernstein, M. S. (2023). Generative agents: Interactive simulacra of human behavior. arXiv Preprint arXiv:2304.03442.

Paulina Porten. (2020, March). Paulina Porten—Augmented Participation. https://paulinaporten.com/augmented-participation-1

Rauschnabel, P. A., Felix, R., Hinsch, C., Shahab, H., & Alt, F. (2022). What is XR? Towards a Framework for Augmented and Virtual Reality. Computers in Human Behavior133, 107289. https://doi.org/10.1016/j.chb.2022.107289

Robinson, K. (2023, November 7). Meta exec: “Metaverse hype is dead.” Fortune. https://fortune.com/2023/07/11/meta-vp-vishal-shah-happy-metaverse-hype-is-dead/

Rottinghaus, B., & Escher, T. (2020). Mechanisms for inclusion and exclusion through digital political participation: Evidence from a comparative study of online consultations in three German cities. Zeitschrift Für Politikwissenschaft30(2), 261–298. https://doi.org/10.1007/s41358-020-00222-7

Spirakis, G., Spiraki, C., & Nikolopoulos, K. (2010). The impact of electronic government on democracy: E-democracy through e-participation. EG7, 75–88. https://doi.org/10.1504/EG.2010.029892

Statistik Austria. (2021). Ein Blick auf die Gemeinde Langenegg. https://www.statistik.at/blickgem/G0201/g80223.pdf

Stephenson, N. (1992). Snow crash. Bantam Books.

Suh, K.-S., & Lee, Y. E. (2005). The Effects of Virtual Reality on Consumer Learning: An Empirical Investigation. MIS Quarterly29(4), 673–697. JSTOR. https://doi.org/10.2307/25148705

Taylor, C., Nanz, P., & Taylor, M. B. (2020). Reconstructing Democracy: How Citizens Are Building from the Ground Up. Harvard University Press.

Thaa, W. (2013). „Stuttgart 21“ – Krise oder Repolitisierung der repräsentativen Demokratie? Politische Vierteljahresschrift54(1), 1–20. JSTOR.

United Nations. (2020, April 15). Embracing digital government during the pandemic and beyond. Economic Analysis & Policy Division. https://www.un.org/development/desa/dpad/publication/un-desa-policy-brief-61-covid-19-embracing-digital-government-during-the-pandemic-and-beyond/

Wolfsfeld, G., Segev, E., & Sheafer, T. (2013). Social Media and the Arab Spring: Politics Comes First. The International Journal of Press/Politics18(2), 115–137. https://doi.org/10.1177/1940161212471716

Xi, N., Chen, J., Gama, F., Riar, M., & Hamari, J. (2023). The challenges of entering the metaverse: An experiment on the effect of extended reality on workload. Information Systems Frontiers25(2), 659–680. https://doi.org/10.1007/s10796-022-10244-x

Narrative, creative – immersive? The immersive potential of right-wing extremist communication on social media

Sandra Kero
Center for Advanced Internet Studies (CAIS), Bochum


Josephine B. Schmitt 
Center for Advanced Internet Studies (CAIS), Bochum 


Social media is ubiquitous. More and more people are using platforms such as Facebook, Instagram and TikTok to share their experiences, opinions and interests directly with others. But what does this mean for our experiences in these digital spaces – and what effects does it have on the formation of our political opinions? In particular, the immersive experiences that users have in these media environments can trigger powerful emotional effects and influence political or ideological attitudes. In this blog article, we look at these immersive effects of social media platforms. Alongside a detailed examination of what exactly immersiveness can mean in this context, we present mechanisms that contribute to the immersiveness of right-wing communication on social media, taking into account concepts and theories from media science, communication science and psychology. Based on these findings, we then provide recommendations for action aimed at developing preventive and repressive measures to counter the improper and antidemocratic use of immersive environments.

INTRODUCTION

Social media platforms like Instagram or TikTok are a popular tool of right-wing actors. As political outsiders, they exploit the opportunities that social media offers to establish themselves as independent voices outside journalistic media – orchestrating their narratives, channelling their political opinions and ideological attitudes, and recruiting new members (Fielitz & Marcks, 2020; Rau et al., 2022; Schmitt, Harles, et al., 2020; Schwarz, 2020). On Instagram, right-wing groups deliberately rely on creators, lifestyle content and a connection to nature to make their ideology more easily digestible (Echtermann et al., 2020). As a result, ideological content is not only subtly presented – often, the corresponding ideological perspectives are explicitly stated (Kero, in press). TikTok, which is particularly popular among young users (mpfs, 2022), is also becoming increasingly important as a platform for the far right to reach target groups that have not yet formed firm political views (pre:bunk, 2023). The range of content is broad here too: alongside supposedly humorous and musical content, there is also material from AfD politicians showcasing their beliefs. Extreme right-wing ideas are made accessible through normalisation strategies, e.g. pseudo-scientific misinformation and emotionalisation (Müller, 2022).

Social media platforms not only give everyone the opportunity to enter the public political discourse – the disseminated content also allows boundaries between political information, entertainment and social issues to become blurred. At the same time, various functions of social media facilitate users’ immersive experiences – something that far-right extremists in particular can benefit from in their communication practices. 

This article aims to contextualise the immersiveness of extreme right-wing content on social media and present selected immersive mechanisms within these environments as examples. On the basis of this, preventive measures are identified. 

WHAT DO WE UNDERSTAND BY IMMERSION?

Immersion essentially means being deeply absorbed or engrossed in a particular environment, activity or experience (Murray, 1998). Immersion is often mentioned as a feature of digital technologies such as virtual reality (VR) or augmented reality (AR), or in the gaming industry (Mühlhoff & Schütz, 2019; Nilsson et al., 2016). From a media theory and media psychology perspective, the term primarily describes the dissolution of the boundary between the physical and the fictional, i.e. users' subjective experience of plunging into an imaginary world or narrative environment and being emotionally involved in it (e.g. E. Brown & Cairns, 2004; Haywood & Cairns, 2006). This is usually associated with a strong feeling of presence in which perception and attention are focused on the immersive content, while attention to time and (real) space decreases for the duration of the immersion (Cairns et al., 2014; Curran, 2018). Immersive environments and mechanisms can encourage engagement and learning (Dede, 2009), but can also amplify the ideological effect of content on users (Braddock & Dillard, 2016).

Moving away from an understanding of the term that is limited to a technical, cultural or psychological phenomenon, Mühlhoff and Schütz (2019) consider immersion from an affect theory and social theory perspective. Here, immersion is described as a dynamic interaction between (non-)human individuals and between them and their environment, i.e. as “a specific mode of emotional and affective involvement in a current or mediatised social event” (p. 20). Affects are spontaneous, instinctive reactions that can be influenced both individually and socially and that shape a person's behaviour (Strick, 2021, among others). The design of certain (media) environments can influence affective reactions in specific ways and thus also modulate power relations and social dynamics (Mühlhoff & Schütz, 2019). In this sense, certain offers of immersion can help to control and regulate behaviour, creating “immersive power” (p. 30). Immersion can therefore be considered a form of situational influence exerted on the thoughts and feelings of the individual – an indirect exercise of power that arises without hierarchies and through social contexts.

Immersion is considered in a wide range of scientific disciplines (e.g. psychology, computer science, cognitive science, design); therefore, the synonyms and closely related concepts are just as diverse. The literature thus includes terms such as presence, flow, involvement and engagement, which describe similar phenomena (e.g. Curran, 2018; Nilsson et al., 2016). To describe psychological immersion in the context of the reception of narrative media content, the term transportation is used in media psychology and communication science research (e.g. Moyer-Gusé, 2008). Immersion as a subjective experience of users is not limited to a specific medium or technology. Learning situations, books, films and social media content can therefore also have an immersive effect, as well as computer games or VR applications.

Against the backdrop of this integrated understanding of the term, we focus in this article on immersive mechanisms in and through social media and their content. We are therefore primarily concerned with the social and psychological dimension of immersiveness.

IMMERSIVE MECHANISMS IN SOCIAL MEDIA

Social media can contribute to immersive mechanisms and effects in various ways. With an understanding of immersion from a media theory and media psychology perspective, these experiences relate to the constant presence of smartphones and, with them, social media applications in our everyday lives, resulting in a merging of virtual, physical and social space. Users’ emotional involvement is also intensified through the primary type of use of social media platforms – such as direct sharing of everyday experiences, interests and opinions – and the (psychological) connection with individual producers of this content, hereinafter referred to as creators, and their posts. From a social theory perspective, a power structure is created here in which affective dynamics can be used in a targeted way to influence the behaviour, attitudes and perceptions of individuals. 

In the following, we aim to take a closer look at the underlying mechanisms in order to thoroughly examine the immersive potential of social media in the communication of extreme right-wing creators. A distinction is made between two contexts: the platform environment as an environment of social events, and the interaction between individuals and content in this environment.

Presenting the persona and the story

In the competition for the undivided attention of users, a wide range of far-right extremists are also taking to social media. The amount of extreme right-wing content has been increasing steadily for years (e.g. Munger & Phillips, 2022). Opinions become products to be sold: social media marketing strategies are used to attract and retain the attention of users (Cotter, 2019) and to shape opinions in line with right-wing ideology. How a particular media persona is presented and the way in which stories are told play a crucial role in the immersiveness and impact of the content provided.

A number of creators and influencers have emerged from the far-right scene in recent years. They are an essential prerequisite for reaching users, especially those who are otherwise not interested in political content. Generally speaking, influencers are previously unknown social media users who become well-known personalities through careful self-promotion and regular posting of content on social media, and who cover a range of topics on their channels (Bauer, 2016). 

Female activists in particular use popular social media platforms to present extreme right-wing ideology to a wide audience in a personal and emotive way through recipes, beauty tips and inspiring landscape photos (Ayyadi, 2021; Echtermann et al., 2020; Kero, in press); the boundaries between entertainment, lifestyle and politics are fluid. The creators act as role models (Zimmermann et al., 2022), are opinion leaders as regards content that threatens democracy (Harff et al., 2022), can attract users to content and channels, and encourage interaction via personal connections (Leite et al., 2022). The greater the trustworthiness and centrality of the creators, the more the recipients report immersive experiences with the media offering (Jung & Im, 2021).

Figure 1: TikTok channel “victoria” – a mix of nature, homeland and historical scenes
Figure 2: The youth organisation of the AfD – classified as an extremist group that threatens the constitution – posts satirical clips on TikTok

Above all, it is the communicative presentation styles of the content producers that let recipients truly immerse themselves in their everyday world. They tell stories of nature, homeland and a sovereign German people, and provide simple answers to complex political questions. Enemy figures to blame for social problems are quickly found and identified on the basis of supposedly distinct characteristics (see Figure 1). Through their narratives, the creators not only create identity and meaning, but also offer recipients a strong group that provides interested individuals with structure and (political) orientation. Formats with an emphasis on humour, pop culture and youth culture make the content easy to digest and build recipients' loyalty to channels and creators (Schmitt, Harles, et al., 2020; see also Figure 2). The network of right-wing actors on social media is dense; they refer to each other's stories and narratives – including across platforms (Chadwick & Stanyer, 2022). An apparent consistency of narratives – the same story being told by several actors – makes a story appear even more credible.

MECHANISMS OF NARRATIVE PERSUASION ARE USED FOR SPECIFIC PURPOSES

Research in the fields of media psychology and communication science identifies three mechanisms that make stories convincing and thus make right-wing extremist communication particularly immersive and effective: transportation, identification and parasocial interactions (e.g. Braddock, 2020). 

Transportation describes the process whereby, in order to understand the narrative presented in a story, people must shift their attention from the real world around them to the world constructed in the narrative. Ideally, as a result of this psychological immersion, they lose their awareness of the real world and completely immerse themselves in the fictional world (Braddock, 2020). More engagement with the narrative world means less questioning of persuasive information (Igartua & Cachón-Ramón, 2023; Moyer-Gusé, 2008).

Identification means recipients adopting the viewpoint or perspective of a media character. This happens, for example, where the media figure is perceived as particularly similar to oneself.

Parasocial interaction is a psychological process in which media users like and/or trust a media figure so much that they feel as if they are connected to them. Where this happens over a longer period, e.g. several videos or articles, it is also called a parasocial relationship. A parasocial interaction or relationship with a media figure reduces users’ reactance to the content presented and the desire to object to content. This facilitates the adoption of beliefs, attitudes and behaviours (Braddock, 2020).

Through narrative storytelling, the direct presentation of everyday situations, self-disclosure and directly addressing recipients, right-wing creators can increase the level of their authenticity and reinforce parasocial relationships with their followers. The more socially attractive a media figure is perceived to be (Masuda et al., 2022) and the greater the recipients' identification with the media figure (Eyal & Dailey, 2012), the stronger the parasocial relationship. Social media creators offer young users in particular great potential for identification by presenting themselves as similar to them and approachable (Farivar & Wang, 2022; Schouten et al., 2020). An example of this is the account belonging to Freya Rosi, a young, right-wing creator. With tips on beauty, baking and cooking and photos of nature, she orchestrates the image of an approachable, traditional young far-right supporter who loves her homeland (a detailed analysis of the account is also found in Rösch, 2023). A particularly close and enduring parasocial relationship with social media protagonists can even lead to addiction-like behaviours among users (de Bérail et al., 2019). In other words: users have difficulty escaping the fictional world manufactured by the social media creators – in this context their content has a highly immersive effect.

Figure 3: Screenshots of Freya Rosi’s Instagram account. With tips on beauty, baking and cooking and photos of nature, the creator orchestrates the image of an approachable, traditional young far-right supporter who loves her homeland. A detailed analysis of the presentation strategies can also be found in Rösch, 2023

Self-disclosure by creators can lead to a higher level of parasocial interactions and relationships among followers via the feeling of social presence, i.e. perceiving the media figure as a natural person (Kim et al., 2019; Kim & Song, 2016), and to greater engagement with and commitment to communicators and their content (Osei-Frimpong & McLean, 2018). In turn, the feeling of social presence makes it less likely that people will check the truth of information (Jun et al., 2017). This is particularly problematic when dealing with extreme right-wing content, e.g. disinformation or conspiracy theories. 

EMOTIONALISATION OF CONTENT

In the context of their communication, radicalised communicators talk about fears, uncertainties and developmental tasks; they also use emotionalising content and images to make their content relatable and convince followers of their world view (Frischlich, 2021; Frischlich et al., 2021; Schmitt, Harles, et al., 2020; Schneider et al., 2019). In the so-called post-truth era, emotions, as well as perceived truths, are becoming a key manipulation tool of extreme right-wing creators. An experimental study by Lillie and colleagues (2021) indicates that narrative content which triggers fear among recipients leads to more flow experiences – i.e. the complete absorption of users in the reception of the content – and encourages behavioural intentions; on the other hand, fear reduces the willingness to contradict content. This makes it appear more convincing. 

The development of social media into platforms for creating, publishing and interacting with (audio-)visual content supports the communication and effect of this sort of content. In particular, (audio-)visual content is suitable for conveying emotional messages (Houwer & Hermans, 1994) and thus generating corresponding emotional responses. In turn, this affects information processing, as the brain pays more attention to processing the emotional reaction than to the information (Yang et al., 2023). Tritt et al. (2016) found that politically conservative people appear to respond more easily to emotional stimuli than liberals.

The immersive effects outlined so far may result in followers becoming more intensely involved in the world portrayed by creators, forging a deeper emotional bond and identifying more strongly with the content presented (Feng et al., 2021; Hu et al., 2020; Slater & Rouner, 2002; Wunderlich, 2023).

THE ROLE OF PLATFORM FUNCTIONS 

Time plays an important role on social media. Users’ attention must be captured quickly in order to attract them to the platform. It is fair to assume that immersive effects of communication on social media are intensified through the design and functional logic of the platforms. Platform-specific algorithms and dark patterns play a special role here. 

Algorithms determine relevance and importance

In addition to highlighting the relevance and importance of certain content, algorithms encourage the emotional involvement of users by preferentially showing them content that they or similar users potentially prefer. This is designed to attract users to the platform and content and retain them for as long as possible; it is part of the platforms’ business model. In this way, platforms let users plunge deeper and deeper into the virtual worlds. This makes them an immersive and effective recruitment tool. The algorithmic selection and sorting of particular content depends on various factors including the type of content, its placement, the language used, the degree of networking between those disseminating it and the responses from users (Peters & Puschmann, 2017). 

Commercial platforms prioritise posts that trigger emotional reactions, polarise opinions or promote intensive communication between members (Grandinetti & Bruinsma, 2023; Huszár et al., 2022; Morris, 2021). In particular, visual presentation modes such as photos, videos and memes trigger affective reactions among recipients and draw them in (Maschewski & Nosthoff, 2019). Algorithmic networking provides technical reinforcement of users' emotional connection and involvement with the content. This also applies to contact with antidemocratic content. Studies suggest that social media algorithms can facilitate the spread of extreme right-wing activities (Whittaker et al., 2021; Yesilada & Lewandowsky, 2022) – sometimes it is enough for extremist content to share keywords with unproblematic content (Schmitt et al., 2018). In particular, people who are interested in niche topics (e.g. conspiracy theories) (Ledwich et al., 2022) and who follow platform recommendations for longer (M. A. Brown et al., 2022) are manoeuvred into corresponding filter bubbles by the algorithms. At the same time, however, the majority of users are shown moderate content (M. A. Brown et al., 2022; Ledwich et al., 2022).

In recent years, TikTok has faced scrutiny particularly because of its strong algorithmic control. This makes it even more likely that users will unintentionally come into contact with radicalised content (Weimann & Masri, 2021). In addition, the stream of content continues to run if users do not actively interrupt it, thus creating a much more immersive experience on the platform compared to other social media (Su, Zhou, Gong, et al., 2021; Su, Zhou, Wang, et al., 2021). Sometimes the platform is even said to have addictive potential due to its functions (e.g. Qin et al., 2022). It is difficult to break away from, and this poses a great danger for users, especially given the increase in extremist communication on TikTok. 

Dark patterns as mechanisms to enhance immersion

Dark patterns also play a role in discussions around the immersive effect of social media. The term describes platform design decisions that use deceptive or manipulative tactics to keep users on the platforms and nudge them into (uninformed) usage decisions (Gray et al., 2023). On social media, dark patterns are primarily aimed at retaining users' attention for long periods, ensuring excessive and immersive use of the platforms and getting users to perform actions (e.g. likes). Examples of dark patterns include activity notifications, the difficulty on Facebook of actually logging out of the platform or deleting an account, and the often counter-intuitive colours of cookie notifications, which lead users to accept all cookies, including unnecessary ones. Nearly all popular platforms and applications contain dark patterns; many users do not notice them (Bongard-Blanchy et al., 2021; Di Geronimo et al., 2020). Creators also aim to influence user behaviour through dark patterns. For example, they manipulate popularity metrics (e.g. likes, number of followers) and images to try to artificially boost their credibility and generate attention for their content (Luo et al., 2022).

WHAT IS AN EFFECTIVE PREVENTIVE RESPONSE?

The immersiveness of extremist social media communication poses a major challenge for preventive measures, in particular because of the many different levels that must be considered. In the following, prevention measures are specified at a) the content level, b) the platform level and c) the media level. To clearly define the measures from the users' point of view, we use the three prevention levels of awareness, reflection and empowerment (Schmitt, Ernst, et al., 2020). Awareness includes a general awareness (e.g. of one's own usage behaviour, extremist narratives, how platforms function), while reflection refers to critical reflection on content and the functioning of the media. Empowerment, finally, means the ability of recipients to position themselves in relation to certain media content or functions and to be capable of acting.

At the content level (a), awareness must be raised about extremist content and its specific digital forms of communication and representation on social media. This is about creating awareness of how right-wing extremist creators communicate in order to gain the attention of social media users, generate interest in their narratives and ultimately motivate people to act in accordance with the far-right world view (Schmitt, Ernst, et al., 2020). Furthermore, users should be enabled to take a critical stance towards extremist content so that they can position themselves accordingly in social discourse. This positioning does not have to take place only in political discussions – simply reporting content to the platform itself or to one of the common reporting bodies (e.g. Jugendschutz.net) also constitutes taking a stance. Alongside strong critical media literacy, historical, intercultural and political knowledge is particularly important here, as is tolerance of ambiguity, which enables people to cope with complex and possibly contradictory information.

Platform-related prevention measures (b) should enable users to recognise, understand and reflect on algorithmic functional logic (e.g. what am I being shown, and why?) and raise awareness of the mechanisms of the platform design (e.g. how is content being shown to me? How does that affect my actions?) (Di Geronimo et al., 2020; Silva et al., 2022; Taylor & Brisini, 2023). For the school context, there is, for example, the second learning package in the CONTRA lesson series (Ernst et al., 2020), which has been explicitly designed to foster critical media literacy with regard to algorithms.

On the platform side, it is important to create transparency around algorithms. Disclosure of the platform design, such as clarifying the functional logic of the recommendation algorithm, can help people identify dark patterns – and thus also understand the potential spread dynamics of extremist messages and counteract them (Rau et al., 2022). In the social media context, dark patterns have been relatively poorly researched to date (Mildner et al., 2023). Research-based information in this field is provided, for example, by the Dark Pattern Detection Project, which also enables users to report dark patterns.

Political regulatory measures, such as those in the Network Enforcement Act (NetzDG) and the Digital Services Act (DSA), can also help as regards the disclosure of platform design and require transparency reports to be provided, for example. With a view to potential new virtual media environments – such as the metaverse – it is essential to enforce transparency rules and consider new forms of design. In what circumstances does the virtual interaction take place? How are the digital activity spaces designed? What criteria are used for the visual representation of the actors, e.g. avatars, and how inclusive is their design? 

At the media level (c), it is crucial to raise users' awareness of their own media behaviour and encourage them to reflect on it. Among other things, this involves questions about the time spent on certain applications (e.g. how long have I spent on TikTok today?) and about the type of use[1] (e.g. what content and mechanisms are drawing me in?). Help could be provided by apps that set time limits, warn about the usage times of other apps or block them (e.g. StayFree or Forest). The digital wellbeing function, which can measure the usage time of individual apps, is already established on smartphones from various manufacturers.

SUMMARY AND OUTLOOK

The statements above illustrate the levels on which the social media communication of far-right creators can have an immersive effect from a social and psychological perspective. This occurs both through the communication styles they choose and through the platforms themselves. When these are combined, a parallel world is created – a metaverse – that attracts users in a variety of ways and can thus exert influence.

From a research perspective, a number of unanswered questions arise. For example, TikTok, as an extremely popular platform, should be examined more thoroughly with regard to its immersive potential. The new possibilities offered by AI-based image generation tools also raise questions about the effect and immersiveness of synthetic imagery produced by extremist communicators. The topic of gaming and right-wing extremism has also gained a lot more attention in recent years (e.g. Amadeu Antonio Stiftung, 2022; Schlegel, no date, 2021). Becoming absorbed in virtual worlds is a key motivation for gamers (for an overview see, for example, Cairns et al., 2014). Currently, there are few reliable findings regarding exactly how far-right actors use gaming for their own purposes, what role avatars, 3D worlds and VR play here and what impact this can have in the context of radicalisation processes. However, some initial findings are available on how interactive games and VR can be used for prevention purposes (e.g. Bachen et al., 2015; Dishon & Kafai, 2022). Through a fun and immersive approach, more abstract topics like politics and democracy – including their controversial aspects – can be explored in an engaging and cooperative way. This makes it possible for people to take different points of view and learn and practise interactions that are also relevant in everyday life. Games can also promote knowledge, empathy and critical thinking. This potential should be explored more fully by researchers and those working in prevention. 
Although we have outlined a selection of considerations for preventing immersive extremist communication, practitioners in the prevention field are encouraged to address the various facets of immersiveness in greater detail and, where appropriate, to use comparable mechanisms for their own goals, taking into account Germany’s prohibition on overwhelming students (Überwältigungsverbot) (bpb, 2011; for criticism of the Beutelsbach Consensus see also Widmaier & Zorn, 2016).

REFERENCES

Amadeu Antonio Stiftung. (2022). Unverpixelter Hass—Gaming zwischen Massenphänomen und rechtsextremen Radikalisierungsraum? https://www.amadeu-antonio-stiftung.de/neue-handreichung-unverpixelter-hass-gaming-zwischen-massenphaenomen-und-rechtsextremen-radikalisierungsraum-81173/

Ayyadi, K. (2021, August 25). Rechte Influencerinnen: Rechtsextreme Inhalte schön verpackt. Belltower.News. https://www.belltower.news/rechte-influencerinnen-rechtsextreme-inhalte-schoen-verpackt-120301/

Bachen, C. M., Hernández-Ramos, P. F., Raphael, C., & Waldron, A. (2015). Civic Play and Civic Gaps: Can Life Simulation Games Advance Educational Equity? Journal of Information Technology & Politics, 12(4), 378–395. https://doi.org/10.1080/19331681.2015.1101038

Bongard-Blanchy, K., Rossi, A., Rivas, S., Doublet, S., Koenig, V., & Lenzini, G. (2021). “I am Definitely Manipulated, Even When I am Aware of it. It’s Ridiculous!”—Dark Patterns from the End-User Perspective. Proceedings of the 2021 ACM Designing Interactive Systems Conference, 763–776. https://doi.org/10.1145/3461778.3462086

bpb. (2011, April 7). Beutelsbacher Konsens. bpb.de. https://www.bpb.de/die-bpb/ueber-uns/auftrag/51310/beutelsbacher-konsens/

Braddock, K. (2020). Narrative Persuasion and Violent Extremism: Foundations and Implications. In J. B. Schmitt, J. Ernst, D. Rieger, & H.-J. Roth (Eds.), Propaganda und Prävention: Forschungsergebnisse, didaktische Ansätze, interdisziplinäre Perspektiven zur pädagogischen Arbeit zu extremistischer Internetpropaganda (pp. 527–538). Springer Fachmedien. https://doi.org/10.1007/978-3-658-28538-8_28

Braddock, K., & Dillard, J. P. (2016). Meta-analytic evidence for the persuasive effect of narratives on beliefs, attitudes, intentions, and behaviors. Communication Monographs, 83(4), 446–467. https://doi.org/10.1080/03637751.2015.1128555

Brown, E., & Cairns, P. (2004). A grounded investigation of game immersion. CHI ’04 Extended Abstracts on Human Factors in Computing Systems, 1297–1300. https://doi.org/10.1145/985921.986048

Brown, M. A., Bisbee, J., Lai, A., Bonneau, R., Nagler, J., & Tucker, J. A. (2022). Echo Chambers, Rabbit Holes, and Algorithmic Bias: How YouTube Recommends Content to Real Users (SSRN Scholarly Paper 4114905). https://doi.org/10.2139/ssrn.4114905

Cairns, P., Cox, A., & Nordin, A. I. (2014). Immersion in Digital Games: Review of Gaming Experience Research. In Handbook of Digital Games (pp. 337–361). John Wiley & Sons, Ltd. https://doi.org/10.1002/9781118796443.ch12

Chadwick, A., & Stanyer, J. (2022). Deception as a Bridging Concept in the Study of Disinformation, Misinformation, and Misperceptions: Toward a Holistic Framework. Communication Theory, 32(1), 1–24. https://doi.org/10.1093/ct/qtab019

Cotter, K. (2019). Playing the visibility game: How digital influencers and algorithms negotiate influence on Instagram. New Media & Society, 21(4), 895–913. https://doi.org/10.1177/1461444818815684

Curran, N. (2018). Factors of Immersion. In The Wiley Handbook of Human Computer Interaction (pp. 239–254). John Wiley & Sons, Ltd. https://doi.org/10.1002/9781118976005.ch13

de Bérail, P., Guillon, M., & Bungener, C. (2019). The relations between YouTube addiction, social anxiety and parasocial relationships with YouTubers: A moderated-mediation model based on a cognitive-behavioral framework. Computers in Human Behavior, 99, 190–204. https://doi.org/10.1016/j.chb.2019.05.007

Dede, C. (2009). Immersive Interfaces for Engagement and Learning. Science, 323(5910), 66–69. https://doi.org/10.1126/science.1167311

Di Geronimo, L., Braz, L., Fregnan, E., Palomba, F., & Bacchelli, A. (2020). UI Dark Patterns and Where to Find Them: A Study on Mobile Applications and User Perception. Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems, 1–14. https://doi.org/10.1145/3313831.3376600

Dishon, G., & Kafai, Y. B. (2022). Connected civic gaming: Rethinking the role of video games in civic education. Interactive Learning Environments, 30(6), 999–1010. https://doi.org/10.1080/10494820.2019.1704791

Echtermann, A., Steinberg, A., Diaz, C., Kommerell, C., & Eckert, T. (2020). Kein Filter für Rechts. correctiv.org. https://correctiv.org/top-stories/2020/10/06/kein-filter-fuer-rechts-instagram-rechtsextremismus-frauen-der-rechten-szene/?lang=de

Ernst, J., Schmitt, J. B., Rieger, D., & Roth, H.-J. (2020). #weARE – Drei Lernarrangements zur Förderung von Medienkritikfähigkeit im Umgang mit Online-Propaganda in der Schule. In J. B. Schmitt, J. Ernst, D. Rieger, & H.-J. Roth (Eds.), Propaganda und Prävention: Forschungsergebnisse, didaktische Ansätze, interdisziplinäre Perspektiven zur pädagogischen Arbeit zu extremistischer Internetpropaganda (pp. 361–393). Springer Fachmedien. https://doi.org/10.1007/978-3-658-28538-8_17

Eyal, K., & Dailey, R. M. (2012). Examining Relational Maintenance in Parasocial Relationships. Mass Communication and Society, 15(5), 758–781. https://doi.org/10.1080/15205436.2011.616276

Farivar, S., & Wang, F. (2022). Effective influencer marketing: A social identity perspective. Journal of Retailing and Consumer Services, 67, 103026. https://doi.org/10.1016/j.jretconser.2022.103026

Feng, Y., Chen, H., & Kong, Q. (2021). An expert with whom i can identify: The role of narratives in influencer marketing. International Journal of Advertising, 40(7), 972–993. https://doi.org/10.1080/02650487.2020.1824751

Fielitz, M., & Marcks, H. (2020). Digitaler Faschismus: Die sozialen Medien als Motor des Rechtsextremismus. Dudenverlag.

Frischlich, L. (2021). #Dark inspiration: Eudaimonic entertainment in extremist Instagram posts. New Media & Society, 23(3), 554–577. https://doi.org/10.1177/1461444819899625

Frischlich, L., Hahn, L., & Rieger, D. (2021). The Promises and Pitfalls of Inspirational Media: What do We Know, and Where do We Go from Here? Media and Communication, 9(2), 162–166. https://doi.org/10.17645/mac.v9i2.4271

Grandinetti, J., & Bruinsma, J. (2023). The Affective Algorithms of Conspiracy TikTok. Journal of Broadcasting & Electronic Media, 67(3), 274–293. https://doi.org/10.1080/08838151.2022.2140806

Gray, C. M., Sanchez Chamorro, L., Obi, I., & Duane, J.-N. (2023). Mapping the Landscape of Dark Patterns Scholarship: A Systematic Literature Review. Companion Publication of the 2023 ACM Designing Interactive Systems Conference, 188–193. https://doi.org/10.1145/3563703.3596635

Harff, D., Bollen, C., & Schmuck, D. (2022). Responses to Social Media Influencers’ Misinformation about COVID-19: A Pre-Registered Multiple-Exposure Experiment. Media Psychology, 25(6), 831–850. https://doi.org/10.1080/15213269.2022.2080711

Haywood, N., & Cairns, P. (2006). Engagement with an Interactive Museum Exhibit. In T. McEwan, J. Gulliksen, & D. Benyon (Hrsg.), People and Computers XIX — The Bigger Picture (S. 113–129). Springer. https://doi.org/10.1007/1-84628-249-7_8

Houwer, J. D., & Hermans, D. (1994). Differences in the affective processing of words and pictures. Cognition and Emotion, 8(1), 1–20. https://doi.org/10.1080/02699939408408925

Hu, L., Min, Q., Han, S., & Liu, Z. (2020). Understanding followers’ stickiness to digital influencers: The effect of psychological responses. International Journal of Information Management, 54, 102169. https://doi.org/10.1016/j.ijinfomgt.2020.102169

Huszár, F., Ktena, S. I., O’Brien, C., Belli, L., Schlaikjer, A., & Hardt, M. (2022). Algorithmic amplification of politics on Twitter. Proceedings of the National Academy of Sciences, 119(1), e2025334119. https://doi.org/10.1073/pnas.2025334119

Igartua, J.-J., & Cachón-Ramón, D. (2023). Personal narratives to improve attitudes towards stigmatized immigrants: A parallel-serial mediation model. Group Processes & Intergroup Relations, 26(1), 96–119. https://doi.org/10.1177/13684302211052511

Jun, Y., Meng, R., & Johar, G. V. (2017). Perceived social presence reduces fact-checking. Proceedings of the National Academy of Sciences, 114(23), 5976–5981. https://doi.org/10.1073/pnas.1700175114

Jung, N., & Im, S. (2021). The mechanism of social media marketing: Influencer characteristics, consumer empathy, immersion, and sponsorship disclosure. International Journal of Advertising, 40(8), 1265–1293. https://doi.org/10.1080/02650487.2021.1991107

Kero, S. (in press). Jung, weiblich und extrem rechts. Die narrative Kommunikation weiblicher Akteurinnen auf der Plattform Instagram. easy_social sciences.

Kim, J., Kim, J., & Yang, H. (2019). Loneliness and the use of social media to follow celebrities: A moderating role of social presence. The Social Science Journal, 56(1), 21–29. https://doi.org/10.1016/j.soscij.2018.12.007

Kim, J., & Song, H. (2016). Celebrity’s self-disclosure on Twitter and parasocial relationships: A mediating role of social presence. Computers in Human Behavior, 62, 570–577. https://doi.org/10.1016/j.chb.2016.03.083

Ledwich, M., Zaitsev, A., & Laukemper, A. (2022). Radical bubbles on YouTube? Revisiting algorithmic extremism with personalised recommendations. First Monday. https://doi.org/10.5210/fm.v27i12.12552

Leite, F. P., Pontes, N., & Baptista, P. de P. (2022). Oops, I’ve overshared! When social media influencers’ self-disclosure damage perceptions of source credibility. Computers in Human Behavior, 133, 107274. https://doi.org/10.1016/j.chb.2022.107274

Lillie, H. M., Jensen, J. D., Pokharel, M., & Upshaw, S. J. (2021). Death Narratives, Negative Emotion, and Counterarguing: Testing Fear, Anger, and Sadness as Mechanisms of Effect. Journal of Health Communication, 26(8), 586–595. https://doi.org/10.1080/10810730.2021.1981495

Luo, M., Hancock, J. T., & Markowitz, D. M. (2022). Credibility Perceptions and Detection Accuracy of Fake News Headlines on Social Media: Effects of Truth-Bias and Endorsement Cues. Communication Research, 49(2), 171–195. https://doi.org/10.1177/0093650220921321

Maschewski, F., & Nosthoff, A.-V. (2019). Netzwerkaffekte. Über Facebook als kybernetische Regierungsmaschine und das Verschwinden des Subjekts. In R. Mühlhoff, A. Breljak, & J. Slaby (Eds.), Affekt Macht Netz: Auf dem Weg zu einer Sozialtheorie der Digitalen Gesellschaft (1st ed., Vol. 22, pp. 55–80). transcript Verlag. https://doi.org/10.14361/9783839444399

Masuda, H., Han, S. H., & Lee, J. (2022). Impacts of influencer attributes on purchase intentions in social media influencer marketing: Mediating roles of characterizations. Technological Forecasting and Social Change, 174, 121246. https://doi.org/10.1016/j.techfore.2021.121246

Mildner, T., Savino, G.-L., Doyle, P. R., Cowan, B. R., & Malaka, R. (2023). About Engaging and Governing Strategies: A Thematic Analysis of Dark Patterns in Social Networking Services. Proceedings of the 2023 CHI Conference on Human Factors in Computing Systems, 1–15. https://doi.org/10.1145/3544548.3580695

Morris, L. (2021, October 27). In Poland’s politics, a ‘social civil war’ brewed as Facebook rewarded online anger. The Washington Post. https://www.washingtonpost.com/world/2021/10/27/poland-facebook-algorithm/

Moyer-Gusé, E. (2008). Toward a Theory of Entertainment Persuasion: Explaining the Persuasive Effects of Entertainment-Education Messages. Communication Theory, 18(3), 407–425. https://doi.org/10.1111/j.1468-2885.2008.00328.x

mpfs. (2022). JIM-Studie 2022. Jugend, Information, Medien. Basisuntersuchung zum Medienumgang 12- bis 19-Jähriger. https://www.mpfs.de/fileadmin/files/Studien/JIM/2022/JIM_2022_Web_final.pdf

Mühlhoff, R., & Schütz, T. (2019). Die Macht der Immersion. Eine affekttheoretische Perspektive. https://doi.org/10.25969/MEDIAREP/12593

Müller, P. (2022). Extrem rechte Influencer*innen auf Telegram: Normalisierungsstrategien in der Corona-Pandemie. ZRex – Zeitschrift für Rechtsextremismusforschung, 2(1), Art. 1. https://www.budrich-journals.de/index.php/zrex/article/view/39600

Munger, K., & Phillips, J. (2022). Right-Wing YouTube: A Supply and Demand Perspective. The International Journal of Press/Politics, 27(1), 186–219. https://doi.org/10.1177/1940161220964767

Murray, J. H. (1998). Hamlet on the Holodeck: The Future of Narrative in Cyberspace. The MIT Press.

Nilsson, N. Chr., Nordahl, R., & Serafin, S. (2016). Immersion revisited: A review of existing definitions of immersion and their relation to different theories of presence. Human Technology, 12(2), 108–134.

Osei-Frimpong, K., & McLean, G. (2018). Examining online social brand engagement: A social presence theory perspective. Technological Forecasting and Social Change, 128, 10–21. https://doi.org/10.1016/j.techfore.2017.10.010

Peters, I., & Puschmann, C. (2017). Informationsverbreitung in sozialen Medien. In J.-H. Schmidt & M. Taddicken (Eds.), Handbuch soziale Medien (pp. 211–232). Springer VS.

pre:bunk. (2023). Rechtsextremismus und TikTok, Teil 1: Von Hatefluencer*innen über AfD und bis rechter Terror. Belltower.News. https://www.belltower.news/rechtsextremismus-und-tiktok-teil-1-147835/

Qin, Y., Omar, B., & Musetti, A. (2022). The addiction behavior of short-form video app TikTok: The information quality and system quality perspective. Frontiers in Psychology, 13. https://www.frontiersin.org/articles/10.3389/fpsyg.2022.932805

Rau, J., Kero, S., Hofmann, V., Dinar, C., & Heldt, A. P. (2022). Rechtsextreme Online-Kommunikation in Krisenzeiten: Herausforderungen und Interventionsmöglichkeiten aus Sicht der Rechtsextremismus- und Platform-Governance-Forschung. Arbeitspapiere des Hans-Bredow-Instituts. https://doi.org/10.21241/SSOAR.78072

Rösch, V. (2023). Heimatromantik und rechter Lifestyle. Die rechte Influencerin zwischen Self-Branding und ideologischem Traditionalismus. GENDER – Zeitschrift für Geschlecht, Kultur und Gesellschaft, 15(2), Art. 2. https://www.budrich-journals.de/index.php/gender/article/view/41966

Schlegel, L. (n.d.). Super Mario Brothers Extreme. Gaming und Rechtsextremismus. Retrieved September 9, 2023, from https://gaming-rechtsextremismus.de/themen/super-mario-brothers-extreme/

Schlegel, L. (2021). The Role of Gamification in Radicalization Processes. modus | zad. https://modus-zad.de/publikation/report/working-paper-1-2021-the-role-of-gamification-in-radicalization-processes/

Schmitt, J. B., Ernst, J., Rieger, D., & Roth, H.-J. (2020). Die Förderung von Medienkritikfähigkeit zur Prävention der Wirkung extremistischer Online-Propaganda. In J. B. Schmitt, J. Ernst, D. Rieger, & H.-J. Roth (Eds.), Propaganda und Prävention: Forschungsergebnisse, didaktische Ansätze, interdisziplinäre Perspektiven zur pädagogischen Arbeit zu extremistischer Internetpropaganda (pp. 29–44). Springer Fachmedien. https://doi.org/10.1007/978-3-658-28538-8_2

Schmitt, J. B., Harles, D., & Rieger, D. (2020). Themen, Motive und Mainstreaming in rechtsextremen Online-Memes. Medien & Kommunikationswissenschaft, 68(1–2), 73–93. https://doi.org/10.5771/1615-634X-2020-1-2-73

Schmitt, J. B., Rieger, D., Rutkowski, O., & Ernst, J. (2018). Counter-messages as Prevention or Promotion of Extremism?! The Potential Role of YouTube Recommendation Algorithms. Journal of Communication, 68(4), 780–808. https://doi.org/10.1093/joc/jqy029

Schneider, J., Schmitt, J. B., Ernst, J., & Rieger, D. (2019). Verschwörungstheorien und Kriminalitätsfurcht in rechtsextremen und islamistischen YouTube-Videos. Praxis der Rechtspsychologie, 1, Art. 1.

Schouten, A. P., Janssen, L., & Verspaget, M. (2020). Celebrity vs. Influencer endorsements in advertising: The role of identification, credibility, and Product-Endorser fit. International Journal of Advertising, 39(2), 258–281. https://doi.org/10.1080/02650487.2019.1634898

Schwarz, K. (2020). Hasskrieger: Der neue globale Rechtsextremismus. Herder.

Silva, D. E., Chen, C., & Zhu, Y. (2022). Facets of algorithmic literacy: Information, experience, and individual factors predict attitudes toward algorithmic systems. New Media & Society, 14614448221098042. https://doi.org/10.1177/14614448221098042

Slater, M. D., & Rouner, D. (2002). Entertainment—Education and Elaboration Likelihood: Understanding the Processing of Narrative Persuasion. Communication Theory, 12(2), 173–191. https://doi.org/10.1111/j.1468-2885.2002.tb00265.x

Strick, S. (2021). Rechte Gefühle: Affekte und Strategien des digitalen Faschismus. Transcript.

Su, C., Zhou, H., Gong, L., Teng, B., Geng, F., & Hu, Y. (2021). Viewing personalized video clips recommended by TikTok activates default mode network and ventral tegmental area. NeuroImage, 237, 118136. https://doi.org/10.1016/j.neuroimage.2021.118136

Su, C., Zhou, H., Wang, C., Geng, F., & Hu, Y. (2021). Individualized video recommendation modulates functional connectivity between large scale networks. Human Brain Mapping, 42(16), 5288–5299. https://doi.org/10.1002/hbm.25616

Taylor, S. H., & Brisini, K. S. C. (2023). Parenting the Tiktok Algorithm: An Algorithm Awareness as Process Approach to Online Risks and Opportunities for Teens on Tiktok (SSRN Scholarly Paper 4446575). https://doi.org/10.2139/ssrn.4446575

Tritt, S. M., Peterson, J. B., Page-Gould, E., & Inzlicht, M. (2016). Ideological reactivity: Political conservatism and brain responsivity to emotional and neutral stimuli. Emotion, 16(8), 1172–1185. https://doi.org/10.1037/emo0000150

Weimann, G., & Masri, N. (2021). TikTok’s Spiral of Antisemitism. Journalism and Media, 2(4), Art. 4. https://doi.org/10.3390/journalmedia2040041

Whittaker, J., Looney, S., Reed, A., & Votta, F. (2021). Recommender systems and the amplification of extremist content. Internet Policy Review. https://doi.org/10.14763/2021.2.1565

Widmaier, B., & Zorn, P. (Hrsg.). (2016). Brauchen wir den Beutelsbacher Konsens? Eine Debatte der politischen Bildung. Bundeszentrale für politische Bildung. https://www.lpb-mv.de/nc/publikationen/detail/brauchen-wir-den-beutelsbacher-konsens-eine-debatte-der-politischen-bildung/

Wunderlich, L. (2023). Parasoziale Meinungsführer? Eine qualitative Untersuchung zur Rolle von Social Media Influencer*innen im Informationsverhalten und in Meinungsbildungsprozessen junger Menschen. Medien & Kommunikationswissenschaft, 71(1–2), 37–60. https://doi.org/10.5771/1615-634X-2023-1-2-37

Yang, Y., Xiu, L., Chen, X., & Yu, G. (2023). Do emotions conquer facts? A CCME model for the impact of emotional information on implicit attitudes in the post-truth era. Humanities and Social Sciences Communications, 10(1), Art. 1. https://doi.org/10.1057/s41599-023-01861-1

Yesilada, M., & Lewandowsky, S. (2022). Systematic review: YouTube recommendations and problematic content. Internet Policy Review, 11(1), 1652. https://doi.org/10.14763/2022.1.1652

Zimmermann, D., Noll, C., Gräßer, L., Hugger, K.-U., Braun, L. M., Nowak, T., & Kaspar, K. (2022). Influencers on YouTube: A quantitative study on young people’s use and perception of videos about political and societal topics. Current Psychology, 41(10), 6808–6824. https://doi.org/10.1007/s12144-020-01164-7


[1] Tools such as Dataskop can make TikTok usage visible.

Regulatory approaches to immersive worlds: an introduction to metaverse regulation

Matthias C. Kettemann
Department of Legal Theory and Future of Law, University of Innsbruck
Leibniz-Institute for Media Research | Hans-Bredow-Institut, Hamburg
Humboldt Institute for Internet and Society, Berlin

Martin Müller
Department of Legal Theory and Future of Law, University of Innsbruck

Caroline Böck
Department of Legal Theory and Future of Law, University of Innsbruck 

1 INTRODUCTION

Metaverses[1] are phenomenologically diverse, technically complex, offer huge economic potential, challenge traditional concepts of democratic codetermination and still largely lack legal foundations. “In the metaverse, you’ll be able to do almost anything you can imagine – get together with friends and family, work, learn, play, shop, create – as well as completely new experiences that don’t really fit how we think […] today,” said Meta’s Mark Zuckerberg, outlining his vision. Where people shop, learn, play, make statements, enter into contracts, that is where standards are relevant. Where we are active, where we express opinions, we come into contact and conflict with others. Standards in the metaverse – like standards in general – solve distribution problems, coordination problems, cooperation problems; they have a shaping, pacifying and balancing function. But who makes the rules for governing the metaverse, and for governing in the metaverses?

These fundamental questions are not easy to answer from a democratic theory perspective. But we can learn from history. In many respects, the metaverse is now where digital platforms were at the turn of the millennium: barely regulated, offering huge potential. As these platforms emerged, the communicative infrastructures of democratic public spheres faced significant changes. With the metaverse, the challenges are accelerating. As regards platforms, institutional solutions for democratic reconnection have come to the forefront; they have even made it into the current German government’s coalition agreement.[2] In scholarship, this process has become known as the “constitutionalisation” of social media (Celeste/Heldt/Keller 2022; Celeste 2022; De Gregorio 2022). These steps have yet to be taken for the metaverse. Examining selected legal aspects of the metaverse contributes to this effort.

The key challenges of regulating the metaverse as a virtual, immersive and interactive space created by merging the physical and digital worlds are therefore found in the areas of data protection and privacy, security, content governance, interoperability, openness and democratic participation. This article focuses on some of these.

Following the introduction (1), two chapters examine the regulation of the metaverse (2) and selected legal questions relating to the application of rules in the metaverse (3). Future developments are then considered (4).[3]

2 REGULATING THE METAVERSE

2.1 The basics

So far, neither national nor European Union legislators have developed a well thought out regulatory approach for the emerging metaverses[4]; regulation currently remains at the stage of normative visions (European Commission 2023).[5] Nevertheless, individual features of the metaverse, its providers and the actants within it (the avatars) are governed by various regulations in the normative multilevel system of private rules, national law and EU law. The technical entry points to the metaverses are regulated as well: VR glasses and similar connection devices are subject to existing product safety regulations.

The metaverse communication space is also bound by standards. These include standards under private law (what the platform allows), under national law (what the state allows) and, increasingly, under European law, owing to the growing body of regulation on digital services, markets, data and algorithms.

2.2 European regulatory approaches

The Digital Services Act (DSA)[6] is an EU Regulation that came into force in November 2022. The DSA and the Digital Markets Act (DMA), which was negotiated at the same time, aim to create a safer digital space in which the fundamental rights of all users of digital services are protected and to establish a level playing field to foster innovation, growth, and competitiveness in the European Single Market.

As stated in Article 2, point (1), of the DSA, the Regulation applies to intermediary services offered to recipients in the European Union, irrespective of whether the providers of the services have their place of establishment in the EU. The scope of the DMA is similarly regulated in Article 1, point (2), of the DMA. If offered to users in the EU, metaverses therefore fall within the scope of the DSA. Of the three intermediary services listed in the DSA, metaverses are regarded as hosting services: to present the virtual world and interaction with it, information provided by users must be stored on their behalf by operators, meaning that the requirements of hosting services in Article 3 (g) (iii) of the DSA are met. Fully decentralised metaverses are a special case. Here, there is not one hosting service operating the metaverse – there are multiple operators. 

Firstly, the operators of metaverses must comply with the rules applicable to all intermediary services in Articles 11–15 of the DSA. Compared with the Electronic Commerce Directive, these introduce further obligations. For example, points of contact for authorities, the Commission and users must be designated (Articles 11 and 12 of the DSA); this obligation can be met through an appropriately designed legal notice as laid down in Section 5 of the Telemedia Act. The terms and conditions of all intermediary services must meet certain requirements (Article 14 of the DSA) and, in particular, ensure legally certain moderation practices that can be challenged.

In addition to the minimum requirements for the content of terms and conditions as described in Article 14, point (1), of the DSA, Article 14, point (4), stipulates that the interests of users are to be taken into account when content is moderated and when complaints are handled by platforms. The fundamental rights of users are explicitly mentioned, such as the right to freedom of expression. Contrary to previous decisions by the Federal Court of Justice, this amounts to a direct horizontal binding of platforms to fundamental rights, irrespective of their size (Quintais/Appelman/Fahy 2022).[7] Metaverse operators must therefore clearly state when and why they moderate and what legal remedies exist. Article 15 on transparency obligations is also relevant with regard to content moderation.

The Digital Markets Act (DMA)[8] attempts to restrict the economic power of the big tech platforms on digital markets. Metaverses have not (yet) been designated as core platform services under the DMA; whether future metaverse operators will meet its thresholds of a significant impact on the internal market and an entrenched and durable position remains open.

The Data Governance Act (DGA)[9] is the first legislative act at Union level to address data sharing. While the GDPR deals with the protection of personal data, the DGA initially aims to regulate the commercial use of data in general, i.e., of personal and non-personal data, and thus represents a reorientation of Union policy (Metzger/Schweitzer 2023).

The European Commission’s proposal for a data act[10] is the core component of the data strategy. The aim is to increase the amount of publicly available data. Currently, devices in the Internet of Things (IoT) generate huge quantities of data which usually remain with the manufacturers and can only be accessed in exceptional cases. Developing more data altruism or data trust models would bring added value for metaverse operators, including small operators.

The European Commission’s proposal for an artificial intelligence act[11] is a risk-based regulation (Ebers et al. 2021, 589 (589); De Gregorio/Dunn, 473 (488 ff.)), in which uses of AI systems are divided into various risk categories, with stricter requirements the higher the risk to the fundamental rights of users.

It also appears possible for interoperability regulations, which are currently primarily geared towards communication services, to ensure a certain standardisation of data formats for metaverses. The regulations on data interoperability currently apply only to data intermediation services (Article 26, points 3 and 4, in conjunction with Article 29 of the Data Act) and operators of data spaces as defined in Article 28 ff. of the Data Act. However, if these services have the success anticipated by the Commission (European Commission, no date), large parts of the digital economy, e.g. operators of metaverses, will use data intermediation services and data spaces in the near future and will therefore inevitably have to follow the standardisation rules. 

2.3 General terms and conditions

It is apparent that the existing regulations only marginally address the main features of the metaverse – namely hardware, software and content – and that much depends on which metaverses prevail in the market and how they are specifically designed. The significant actors are instead the above-mentioned digital companies with their contract-based private regulations.

3 REGULATION IN METAVERSES

3.1 Communication space 

Private ordering on digital platforms such as social media today rests firstly on technical settings and secondly on the guidelines that the digital company itself has developed (Quintais/de Gregorio/Magalhães 2023). These rules are often called community guidelines. They govern users’ relationships with each other as well as the relationship between user and platform.[12] These rules do have an ordering effect, because they constitute private communication rules in the manner of a partial regime constitution. Systematically, however, they belong to civil law and take concrete effect through the private-law legal relationship (Quintais/de Gregorio/Magalhães 2023). The admissibility of such regulations can in turn be derived from the Basic Law, specifically from the fundamental rights of private autonomy, occupational freedom and freedom of ownership, since these principles enable private individuals to organise private structures and systems within legal parameters (Teubner 2012, 36 ff.; Mast/Kettemann/Schulz 2023, forthcoming).

Use of a platform is only possible after consenting to the terms of use, which include the community guidelines. Thus, a platform usage agreement is typically concluded.[13] The terms of use themselves are regularly incorporated into the contract as standard business terms in accordance with Section 305 (1) sentence 1 of the German Civil Code (BGB).[14] This classification as standard business terms is unproblematic because the terms of use are unilaterally provided by the company operating the platform for a number of contracts, are pre-worded and are fundamentally not negotiable. The law on standard business terms therefore applies to the terms of use, and each individual clause must withstand review under that law in its own right.

In addition to the specific prohibited clauses in Sections 308 f. of the Civil Code, Section 307 (1) sentence 1 of the Civil Code, concerning the principle of good faith, offers the possibility of bringing constitutional valuations to bear as an opening clause under civil law.[15] Case law has taken advantage of this and, by way of the doctrine of indirect third-party effect,[16] has established a commitment to fundamental rights for operators of online platforms[17], which must be taken into account when designing terms of use. The fundamental rights do not bind the platforms directly because they are private companies and have no state or state-like role.[18] A state-like role is only to be accepted if a private actor in fact “grows into a comparable obligation or guarantor role that traditionally belongs to the state”[19]. In the field of communication, in which platforms – and in the future sometimes also metaverses – operate, such a role can only be assumed if “private companies themselves take on the task of providing conditions for public communication and thus take on roles that – like ensuring post and telecommunication services – were formerly assigned to the state as a service for the public”[20]. Such communication provision is, however, not (yet) happening, though it cannot be completely ruled out, since access to metaverses is privately regulated far more intensively.

3.2 Avatars

In metaverses, human actors act via non-physical actants: avatars. From a civil law point of view, the question is how the aspect of merging the real world with the world of the metaverse via an avatar affects issues of attribution under civil law. It is clear that the avatar is the central point of contact for actions in the metaverse. Avatars represent the digital identity of a natural or legal person and are considered an extension of a legal entity (Kaulartz/Schmid/Müller-Eising 2022). The avatar will represent the significant attribution object in the metaverse, which will perform all actions in the metaverse for or by the legal entity (Rippert/Weimer 2007). People ordering items on the Internet are also acting in a technology-mediated way – from this viewpoint, an avatar is no different from an email that can be dressed up. Avatars are able to make declarations of intent in the metaverse, e.g. to purchase concert tickets or other goods and services (Kaulartz/Schmid/Müller-Eising 2022). However, it is not the avatar who is acting, but the person behind it.

Can avatars be liable to prosecution? No, but the people acting through them can be. A post or an email cannot commit an offence, so the focus must shift away from the actant – the avatar – to the person whenever legal responsibility is to be attributed. However, this presupposes the applicability of national criminal law, such as the German Criminal Code (StGB). According to the territorial principle, sovereign powers of punishment are restricted to a state’s own territory (more on the territorial principle: Mills 2006; basis: Schmalenbach/Bast 2017). When determining the location of criminal acts on the Internet, it is recognised that at least acts committed against a German national or by a German national are subject to German criminal law (Schönke/Schröder/Eser/Weißer, no date). This principle can be transferred to the metaverse if the avatar in the metaverse can be associated with a specific real person (concurring: Kaulartz/Schmid/Müller-Eising 2022, 521 (529 f.)).

In reverse: if an avatar is a victim of an act, e.g. of an insult, does this also always affect the person behind it? That depends: an avatar is a communication medium. I cannot insult a mobile phone, but in the case of an avatar a distinction must be made as to how strong the relationship is between the avatar and the real person behind it. If the relationship is very strong – for example, if an avatar depicts a person’s key characteristics so that the real person is identifiable – then an insult (to the person behind it) is likely to be assumed.

Some violations of rights, such as murder or conventional theft, cannot be committed against avatars. For other violations, different offences than in real life may apply: if an avatar is “kidnapped”, for example, someone can be held accountable on the grounds of computer-related offences (hacking).

May avatars be copied or distorted? That also depends. They have no personal rights of their own but, insofar as a real person is recognisable, the rights of this person prevail. Intellectual property rights may also be relevant for avatars that do not resemble anyone. Not everyone can create an avatar in the form of a Disney character, for example.

4 A METAVERSE FOR EVERYONE?

Despite substantial investment, metaverse technology is neither market-ready nor widely deployed. This means national and European legislators can still introduce appropriate rules to proactively protect individual areas of freedom and reduce negative social consequences. While the law relating to platforms (DSA and DMA) only emerged around 20 years after platforms gained importance at the beginning of the 21st century, smart metaverse regulation can ensure the effective protection of legal rights from the outset. The constitutionalisation process of the platforms through internal juridification and external attribution of responsibility (e.g. through procedural obligations, checks of terms and conditions, transparency requirements and risk-minimisation obligations) can thus be deployed in a targeted manner at a much earlier stage for the metaverse, before its use becomes widespread.

Centralised metaverses present risks for democratic values, for open discourse and for rational processes of self-determination. Metaverse operators can abuse their special position of influence over rules and moderation practices. The opening up of the platforms and increased criticism from the perspective of democratic theory clearly show the direction in which the regulatory wind is blowing (Hermann 2022).

With regard to the metaverse in particular, there are good arguments for developing innovative models for the institutional restriction of operators’ power and for improving the legitimacy of the metaverse’s social arrangements. This should happen in close collaboration with all stakeholders involved in digital transformation. As called for by the National Academy of Sciences, “innovative participation ideas must be specifically promoted because established platform operators and service providers are likely to be closely attached to their current business and participation models, and this could hinder the support of democracy-friendly, commercially less usable formats from private commercial sources” (German National Academy of Sciences Leopoldina/acatech – National Academy of Science and Engineering/Union of the German Academies of Sciences and Humanities 2021, 56). Democracy in the metaverse could thus be supported by initiatives from below and outside.

The EU has recognised this and has set up comprehensive, Union-wide citizens’ panels. In addition to guidance on good behaviour in the metaverse, these have produced further recommendations and eight basic principles that are intended to apply to the regulation of the metaverse and within the metaverse (Bürgerrat.de 2023). The protection of users has been highlighted as a particularly important basic principle. Such participatory processes – Meta itself has organised similar events on a global scale – are important preliminary stages in the development of legitimate rules for virtual worlds. Adjusting regulation now would be effective against the backdrop of the Brussels effect: the EU’s rules have an impact on the international community and encourage other states to adopt them or enact similar legislation of their own. Also pointing in this direction are the first preliminary studies by European institutions, including a document by the Committee on Culture and Education[21] and a longer study by the European Parliament Committee on Legal Affairs (2023).

It would also be desirable to have an international legal framework to determine access to and development of metaverses, as the metaverse cannot be thought of on a national level but must be considered global, especially since it is designed for global use. Nor should global regulation exclude issues of international solidarity: access to the metaverse is currently based on privilege. If the metaverse is to develop into a fair and open communication space where rights are supported, perspectives of access for all must also be developed.

BIBLIOGRAPHY

Bürgerrat.de, Virtual worlds for everyone, 24 April 2023, https://www.buergerrat.de/en/news/virtual-worlds-for-everyone/

Celeste/Heldt/Keller (eds.), Constitutionalising Social Media (2022)

Celeste, Digital Constitutionalism (2022)

Choi, Sam Jungyun et al. (2023). Regulating the Metaverse in Europe, https://www.globalpolicywatch.com/2023/04/regulating-the-metaverse-in-europe.

De Gregorio, Digital Constitutionalism in Europe (2022).

De Gregorio/Dunn (2022), Common Market Law Review, 473 (488 ff.).

European Commission (no date), European Data Strategy, https://commission.europa.eu/strategy-and-policy/priorities-2019-2024/europe-fit-digital-age/european-data-strategy_en (last accessed: 6 June 2023).

German National Academy of Sciences Leopoldina/acatech – National Academy of Science and Engineering/Union of the German Academies of Sciences and Humanities (2021). Digitalisierung und Demokratie, https://www.leopoldina.org/uploads/tx_leopublication/2021_Stellungnahme_Digitalisierung_und_Demokratie_web_01.pdf, p. 56.

Ebers et al. (2021). Multidisciplinary Scientific Journal, 589.

European Commission (2023). Virtual worlds (metaverses) – a vision for openness, safety and respect, https://ec.europa.eu/info/law/better-regulation/have-your-say/initiatives/13757-Virtual-worlds-metaverses-a-vision-for-openness-safety-and-respect_en

European Parliament Committee on Legal Affairs (2023). Metaverse.

Hermann, Demokratische Werte nach europäischem Verständnis im Metaverse (2022).

Kettemann/Böck (2023). Regulierung des Metaverse. In Steege/Chibanguza (eds.), Metaverse (2023) (forthcoming).

Kettemann/Müller (2023), Plattformregulierung, in Steege/Chibanguza (eds.), Metaverse (2023) (forthcoming).

Mast/Kettemann/Schulz (2023, forthcoming). In Puppis/Mansell/van den Bulck (eds.), Handbook of Media and Communication Governance.

Metzger, Axel/Schweitzer, Heike (2022). Shaping Markets: A Critical Evaluation of the Draft Data Act. ZEuP, 01, 42.

Mills, International and Comparative Law Quarterly 55 (2006), 1 (13).

Müller/Kettemann (2023, forthcoming). European approaches to the regulation of digital technologies. In Werthner et al. (ed.), Introduction to Digital Humanism (2023)

Quintais/Appelman/Fahy (2022). “Using Terms and Conditions to Apply Fundamental Rights to Content Moderation”, https://ssrn.com/abstract=4286147, 25.

Quintais, João Pedro/De Gregorio, Giovanni/Magalhães, João C. (2023). How platforms govern users’ copyright-protected content: Exploring the power of private ordering and its implications. Computer Law & Security Review, 48 (Article 105792), 3.

Quintais/de Gregorio/Magalhães, Computer Law & Security Review 2023, 105792, 5.

Schmalenbach/Bast, VVDStRL 2017, 245 (248).


[1] The metaverse is understood in the sense of the EU’s (non-binding) definition as “an immersive and constant virtual 3D world where people interact through an avatar to enjoy entertainment, make purchases and carry out transactions with crypto-assets, or work without leaving their seat” (European Commission’s Analysis and Research Team, Metaverse – Virtual World, Real Challenges, 9 March 2022, p. 3.). Various metaverses exist.

[2] The German National Academy of Sciences Leopoldina also recommends the increased democratic reconnection of platforms. See Leopoldina, the Union of the German Academies of Sciences and Humanities, acatech – National Academy of Science and Engineering (2021): Digitalisierung und Demokratie, https://www.leopoldina.org/uploads/tx_leopublication/2021_Stellungnahme_Digitalisierung_und_Demokratie_web_01.pdf, p. 46.

[3] More on the role of law in the metaverse, with further evidence: Kettemann/Böck (2023, forthcoming) and Kettemann/Müller (2023, forthcoming). More on platform regulation can be found in Müller/Kettemann (2023, forthcoming). This article builds on these explanations and summarises them.

[4] However, consultations have already been carried out, see Choi et al. 2023.

[5] “The European Commission will develop a vision for emerging virtual worlds (e.g. metaverses), based on respect for digital rights and EU laws and values. The aim is open, interoperable and innovative virtual worlds that can be used safely and with confidence by the public and businesses.” (European Commission 2023).

[6] Regulation (EU) 2022/2065 of the European Parliament and of the Council of 19 October 2022 on a Single Market For Digital Services and amending Directive 2000/31/EC (Digital Services Act), OJ L 277, 1.

[7] Spindler, GRUR 2021, 545 (551); Mast, JZ 2023, 287 (289).

[8] Regulation (EU) 2022/1925 of the European Parliament and of the Council of 14 September 2022 on contestable and fair markets in the digital sector and amending Directives (EU) 2019/1937 and (EU) 2020/1828 (Digital Markets Act), OJ L 265, 1.

[9] Regulation (EU) 2022/868 of the European Parliament and of the Council of 30 May 2022 on European data governance and amending Regulation (EU) 2018/1724 (Data Governance Act), OJ L 152, 1.

[10] Proposal for a Regulation of the European Parliament and of the Council on harmonised rules on fair access to and use of data (Data Act), COM(2022) 68 final.

[11] Proposal for a Regulation of the European Parliament and of the Council laying down harmonised rules on artificial intelligence (Artificial Intelligence Act) and amending certain Union legislative acts, COM(2021) 206 final.

[12] Comparisons: Instagram’s Community Guidelines, available at: https://help.instagram.com/477434105621119/?helpref=hc_fnav; more detail at: Mast/Kettemann/Schulz (2023, forthcoming).

[13] Friehe NJW 2020, 1697 (1697); basis OLG Munich NJW 2018, 3115 (3116); concurring: BGH ZUM 2021, 953 (957 f.).

[14] Spindler, CR 2019, 238 (240); Friehe, NJW 2020, 1697 (1697); OLG Munich NJW 2018, 3115 (3116).

[15] Munich commentary Civil Code/Wurmnest, Civil Code Section 307 margin no. 57.

[16] BVerfGE 7, 198 (205 ff.).

[17] BVerfG NJW 2019, 1935 (1936) relating to digital platforms; also: BGH NJW 2012, 148 (150 f.); BGH NJW 2016, 2106 (2107 ff.) relating to liability of hosting services.

[18] BGH ZUM 2021, 953 (961).

[19] BGH ZUM 2021, 953 (960); BVerfG ZUM 2020, 58 (70) with further references.

[20] BGH ZUM 2021, 953 (960); BVerfG ZUM 2020, 58 (70 f.) with further references.

[21] Draft Opinion of the Committee on Culture and Education for the Committee on the Internal Market and Consumer Protection on virtual worlds – opportunities, risks and policy implications for the single market (2022/2198(INI)), 27 April 2023, https://www.europarl.europa.eu/doceo/document/CULT-PA-746918_EN.pdf; with amendments: Amendments 1–64, Draft opinion by Laurence Farreng (PE746.918v01-00), Virtual worlds – opportunities, risks and policy implications for the single market (2022/2198(INI)), 5 June 2023, https://www.europarl.europa.eu/doceo/document/CULT-AM-749262_EN.pdf.

Chances and limits of immersive environments (IU) for anti-discrimination and (historical-)political education

Dr. Deborah Schnabel is a psychologist and has been the director of the Anne Frank Educational Center in Frankfurt/Main since 2022.

The digitalization of the public sphere puts political educators under considerable pressure. With every new technology and app, a new forum emerges that can become a vehicle for anti-democratic agitation, but that also offers the opportunity to bring democracy education to more people. Racism, anti-Semitism, and other threats to democracy are present in all areas of life, and they are quickly conquering the new digital worlds. With every new space, democracy is also renegotiated, and political education is called upon to intervene in this negotiation.

Immersive environments (IU) offer potential to reach more people at a lower threshold and with a different degree of involvement. There is great interest from the entire field of political education to become effective in these new worlds. It may even be possible to preserve important testimonies and witnesses of history by recreating them in virtual realities. Therefore, it is primarily historical-political education, memorial sites, and museums that are launching projects despite the insufficiently explored possibilities of immersive environments. 
Memorial sites in particular feel the pressure to bring places of remembrance to life, especially in the field of Shoah education: the last eyewitnesses are leaving us, and first-hand accounts will soon no longer be possible. Many actors in political education see immersive environments (IU) as a way to reconstruct the authenticity of an encounter with a contemporary witness. For example, there is the Anne Frank House in Amsterdam, whose digital VR exhibition allows a tour of the family’s hiding place. There is also the app “Inside Auschwitz”, developed by WDR, which combines the accounts of three eyewitnesses with a virtual visit to the memorial.[1] Additionally, there is the project “Dimensions in Testimony” by the USC Shoah Foundation (USC SF), which lets eyewitnesses speak to us as AI-based holograms.[2] There are also first projects in which students “travel back” in the metaverse to key moments in Black history.[3] What all actors in the field have in common is that they perceive the new possibilities of IU as very important, sometimes invest heavily in the field and have high expectations of its possibilities, but at the same time handle it rather restrictively. For example, the VR Anne Frank House has no protagonists at all, but instead presents the hiding place furnished, in contrast to the physical exhibition. Other applications only allow certain characters, who usually have a bystander role. Behind this lies the still unanswered question of how to unite the possibilities of new technology with the principles of political education. Whether IU becomes suitable for widespread use in democracy education and history teaching will depend on the quality of the answers.

No matter how it is approached, IU will come up against limits – technological ones (which I will address later), but often also educational-theoretical ones: Is “artificial authenticity” even possible? Does it make pedagogical sense? Or is an immediacy that cannot be reconstructed fetishized here?

HOW REALISTIC CAN REMEMBERING BE?

The limits of immersion are often discussed in such applications, a debate that is older than IU. In memorial education, long before the Internet, there was a debate about how deep the visitors’ experience should go. The main issue was whether such an experience should be accompanied or unaccompanied. In German-speaking contexts, the so-called Beutelsbach Consensus was used as a guideline, which emphasizes the prohibition of overwhelming, the requirement of controversy, and student orientation. It is difficult to apply Beutelsbach to IU, as such environments are overwhelming by their very game logic. In the USA, by contrast, the focus is often on “empathy education”, the development of compassion and understanding, often with methods that are quite “overwhelming”.[4] The problem of “artificial authenticity”, which can lead to distancing or empathy, is obvious: history can potentially be distorted. A purely imagined image of the past emerges that is congruent neither with historical knowledge nor with the victims’ world of experience.

Apart from the inherent pedagogical risks of the technology, the controversy also extends to content. The biggest point of contention is the possibility for non-victims to play victims of racism or anti-Semitism in IU. What some may view as an opportunity for empathy and a change of perspective can be seen by others as insensitive, as cultural appropriation, as a false equation of perpetrators and victims, as a violation of the prohibition of overwhelming and as a potential trigger for (transgenerational) trauma. Complex compositions of identity features are truncated and cannot be explored in depth, especially when a typical VR session usually does not exceed one hour. The socio-political effect on the dominant society must also not be underestimated: memory-political problematics and exoneration narratives can be reinforced if the descendants of the perpetrators can, through IU, feel even more strongly part of the persecuted.

The balance of distance and empathy that political education has to maintain in IU could be understood as a kind of variant of the “uncanny valley” effect – at a certain level of reality of the simulation it becomes strange, “creepy”. The mentioned example of the WDR app “Inside Auschwitz” illustrates how difficult this balancing act is. On the one hand, it deliberately tries to eliminate the distance: “The users should consciously not take a distanced position but explore the memorial site as if they were visiting it on foot.”[5] On the other hand, the app’s producers warn against too much closeness and even recommend not using VR glasses in a school context, as the lack of distance can disturb students.[6]

It appears that applications are better accepted when they use alienation effects, creating a mixed space that is not real, but not completely fictional either. The aim is not identification, but rather sympathy. This seems to apply to the medium as a whole: while VR/AR are still niche products, MMORPGs[7], for example, function excellently as a mass medium. Here, it often appears to be the artificiality of the environment that makes confident engagement possible. There is never any doubt that one is not in reality; an avatar that does not resemble the user enables greater identification than the clinical perfection of VR/AR, which often does not let users experience their body at all: when looking down with VR goggles, one sees not one’s body, but the floor.

It seems almost impossible to anticipate every potential user and every conceivable concern. Can’t we operate as we do in everyday life – with limited knowledge, but with room to learn? Our daily behaviour is also based on limited knowledge, yet it takes us quite far.

Educational science findings on the pedagogical impact of IU are still sparse: mixed results were found for “Anne Frank House VR” compared to desktop learning – while identification with those involved was stronger, less content was absorbed.[8] Learning dynamics are complex and require more intensive research.

METAVERSE AND CO.

Certainly the best-known IU application is Mark Zuckerberg’s “Horizon Worlds”, widely known simply as “the Metaverse”[9] – known both for its ambition and for negative headlines. For our field, the important question is less whether “the Metaverse”, i.e. Horizon Worlds, has been successful than whether a similar platform for IU applications could be successful. Civic education must prepare for the possibility that such a platform could become a mass medium. Comparisons can be drawn with the failed “Second Life” platform.

The same questions that apply to social media also apply to the metaverse: Who leads the discourse? Who dominates? Do discriminated people feel safe in the metaverse? The higher degree of reality of IU puts the classic problems of social media in sharper focus. There is still little research on whether IU has an even greater radicalization potential than social media already does. Heuristically, it is probably safe to assume at least a similar threat level as social media. Furthermore, adaptive and reactive content in IU will increasingly be produced by AIs, as human content development cannot keep up with the demands of these complex environments – this is also the view of a contribution by the World Economic Forum, for example.[10] In this scenario, it is conceivable that AI language models could create personalized worlds that essentially affirm users’ attitudes, reinforce problematic attitudes, force ideological cocooning effects, and drive users into radicalization tunnels. The already low inhibition threshold in the metaverse would do the rest.[11] These structural aspects of AI racism could have an exponential effect in immersive environments: racist echo chambers in virtual worlds would inevitably leak outward, shaping society – and in turn serving as source material for AI content generators.

The likelihood of this scenario also depends on the people who develop the metaverse. They would need to be diversity- and discrimination-sensitive from recruitment through beta testing, in all areas of work. Experience has shown that this cannot be achieved by decree, but only through the active participation of the people affected. The question of who uses the metaverse and why has not yet been answered. What inhibitions might exist for certain parts of society to use the metaverse? Why might this be the case? Do corporations like Meta actively counteract such tendencies? What is being done to provide representation and protection? How do Meta and Co. prevent right-wing infiltration, as can be observed on other, more niche digital platforms?

POLITICAL EDUCATION IN THE METAVERSE

Apart from these questions, this task arises in our field of action: How should civic education respond to these and similar questions? In our opinion, the question should not be whether political education should now “invest in the metaverse.” Such a generalist approach is doomed to fail. However, it should definitely be experimented with – the metaverse should be seen as software in which concrete, limited settings are tried out. Thus, the maxim should not be “We’re putting our whole memorial into the Metaverse,” but rather, “This specific exhibition can (also) be visited in the Metaverse.”

The potential of such steps can already be observed today, for example in Minecraft. A library of Black literature, for instance, was saved not only to Meta’s “Horizon Worlds” but also to a Minecraft server. One explanation for Minecraft’s success could be found in the “uncanny valley”: it does not try to imitate reality. There is currently a project underway to transfer the Yad Vashem memorial to Minecraft[12], as well as private projects that strive to reflect the Holocaust on the platform[13]. However, this should not hide the fact that Minecraft is also used by the opposite side: users regularly point out servers on which Nazi atrocities are glorified.

EXCURSUS: AI, DISCRIMINATION AND JUSTICE

The increasing importance of AI in various environments is platform-independent and requires its own analysis. The language used to discuss “artificial intelligence” promises a revolution in all aspects of life, from the world of work and macroeconomics to literature and art. At the same time, a promise of justice based on procedural rationality and the absolute neutrality of the algorithm is being made. Exclusions resulting from misanthropic attitudes can thus be avoided automatically – the assessment is made on the basis of an alleged objectivity that cannot be achieved by humans.

In fact, the publicly known language models, first and foremost ChatGPT, do not seem well suited to reproducing discriminatory language – ChatGPT in particular stands out here for its robust ethical framework and strongly opposes racism and discrimination. Attempts to use the software to produce racist texts are, in theory, meant to fail from the outset (although this can be circumvented with appropriate prompts). It should also not be forgotten that there are projects that specifically train language models to combat misanthropy, such as the international research project “Decoding Antisemitism”[14].

At the same time, there are increasing reports that structural racism is being reinforced by intelligent image-recognition software. The algorithm bases its decisions on patterns it finds in society and evaluates non-white people differently than white people. This means that racism is avoided on a linguistic level while it is perpetuated, automated, and industrialized on an operational level, usually without transparency for those affected. If AI is used in the future to make decisions about credit and housing allocation, job applications, and social forecasts, the problems are foreseeable. Unlike human decisions, machine decisions are rendered largely immune to criticism by their apparent objectivity.

The framework conditions of the AI industry also need to be critically examined – for example, the underpaid Kenyan workers whose labour was essential for the success of ChatGPT.[15] On an economic level, AI reproduces capitalism instead of critically questioning it, and it does so along racial distinctions. Who owns the systems? Who is allowed to market their services, and who benefits from the increased productivity? Who owns AI art whose creation is inspired by the names of famous artists? Who can work in this new AI world, whose job may be replaced by automation, and whose life may be managed by AI? All indications suggest that those affected by racism will not receive positive answers to these questions, and that the negative effects of the revolution will disproportionately affect them.

Furthermore, the implications for anti-Semitism should not be underestimated. Already, the portrayal of Jewish people is subject to a historical bias – image-creation programs draw primarily on historical representations and images of ultra-Orthodox groups. As a result, Jewish people, who today participate in modern life as a matter of course, are absent from the collective imagination.

Last but not least, AI systems can already be used to generate an endless stream of fake news, deep fakes, and conspiracy-theory content – with a plausibility and apparent authenticity that current conspiracy ideologues can only dream of. In all likelihood, the effects of this will be felt particularly strongly by Jewish people.

All these questions arise with increased urgency when AI is not just a nice toy within immersive environments, but is instead constitutive of them. How will educational IUs change when they no longer present content prefabricated by humans, but AI dynamically generates it? What will happen when AI-based avatars target users and radicalize or intimidate them with personalized propaganda? Where is the control? And how can automated patterns of reproduction of anti-human attitudes be interrupted?

This requires asking questions that point beyond the field of IU in civic education and that naturally call for further exploration: What conditions for success do we need for a discrimination-sensitive AI ethic? How can AI be taught to recognize power asymmetries? Here, limits become apparent that are inherent in the current workings of AI – it is fundamentally reproductive.

A COUNTER-MODEL: IU AS LIVED UTOPIA

Perhaps it is not expedient to reconstruct stories of persecution and experiences of discrimination for political education in IU. Perhaps it would make sense to simulate worlds in IU in which a certain utopian hope has become real. Instead of putting learners into a pedagogical capsule, with certain preconceived content and learning goals, they could independently explore worlds that have already been liberated. The learning effect would arise precisely from the absence of a context of violence that is now commonplace.

An example of this is the everyday simulation game “The Sims” (released in 2000) and its role in promoting homosexual/queer emancipation. In this PC game, queer relationships could be experienced for the first time in a mainstream title without being a constant theme. Many queer people today report having experienced queer representation and empowerment there for the first time, precisely because the possibility was not imposed or used as a teaching tool, but could be explored out of intrinsic motivation.

Perhaps political education for IU should take a similar approach: provide a concrete utopia that can be explored, with whose possibilities users can experiment freely – and in the process reduce fears and prejudices. The virtual world should be a model for reality, not the other way around.

This approach would possibly have the advantage of avoiding the numerous pitfalls mentioned above that threaten political education in IU from the outset. The boundaries of the user experience would also have to be drawn less tightly, there would be less need for pedagogical guidelines, and less top-down control overall.


[1] Planet Schule: Inside Auschwitz | Erzählungen Überlebender und 360 Grad-Videos (10.12.22), URL: https://www1.wdr.de/schule/digital/unterrichtsmaterial/dreisechzig-inside-auschwitz-100.html (accessed: 12.09.23).

[2] N.N.: Dimensions in Testimony. Speak with survivors and other witnesses to the Holocaust and other genocides through their interactive biographies (n.d.), URL: https://sfi.usc.edu/dit (accessed: 12.09.23).

[3] N.N.: Education in the Metaverse takes students back in history (01.12.23), URL: https://www.wbur.org/hereandnow/2022/12/01/black-history-metaverse (Stand: 12.09.23).

[4] Besand, A., Overwien, B., & Zorn, P. (Hrsg.) (2019): Politische Bildung mit Gefühl. Bundeszentrale für politische Bildung, Bonn.

[5] Verena Lucia Nägel, Sanna Stegmaier: AR und VR in der historisch-politischen Bildung zum Nationalsozialismus und Holocaust – (Interaktives) Lernen oder emotionale Überwältigung? (08.10.2019), URL: https://www.bpb.de/lernen/digitale-bildung/werkstatt/298168/ar-und-vr-in-der-historisch-politischen-bildung-zum-nationalsozialismus-und-holocaust-interaktives-lernen-oder-emotionale-ueberwaeltigung/ (Stand: 12.09.23). 

[6] “The stories told by the witnesses and the images from the camp can be challenging for sensitive students or young people with their own experience of war and flight. It can help to watch the videos with a little more distance (for example, over the shoulder of a classmate holding the tablet). In addition, it is advisable not to use headphones or VR.” Planet Schule: Inside Auschwitz | Erzählungen Überlebender und 360 Grad-Videos (10.12.22), URL: https://www1.wdr.de/schule/digital/unterrichtsmaterial/dreisechzig-inside-auschwitz-100.html (Stand: 12.09.23).

[7] Massively Multiplayer Online Role-Playing Games, online video game worlds with many thousands of users.

[8] Miriam Mulders: Learning about Victims of Holocaust in Virtual Reality: The Main, Mediating and Moderating Effects of Technology, Instructional Method, Flow, Presence, and Prior Knowledge.(06.03.23), URL: https://www.mdpi.com/2414-4088/7/3/28 (Stand: 12.09.23).

[9] The term “metaverse” in the narrower sense is theoretically meant to encompass all immersive environments and their interconnection – in this sense, “the metaverse” does not yet exist, but individual IUs (Horizon Worlds, Minecraft, etc.) do. ChatGPT offers the following definition of the metaverse (prompt: write a scientific definition of the metaverse): “The Metaverse is a comprehensive, large-scale virtual environment that integrates multiple interconnected digital spaces, facilitating seamless interactions, communication, and collaboration among users, who are represented by digital representations known as avatars. This virtual framework merges aspects of virtual reality, augmented reality, and interconnected networks to create an immersive and persistent ecosystem for users, with wide-ranging applications spanning entertainment, social interaction, education, commerce, and various other domains. By transcending physical limitations and geographical boundaries, the Metaverse fosters novel opportunities for creative expression, innovation, and interaction in the digital realm.”

[10] N.N.: “AI is shaping the metaverse – but how? Industry experts explain” (9.05.2023). URL: https://www.weforum.org/agenda/2023/05/generative-ai-and-how-can-it-shape-the-metaverse-industry-experts-explain/ (Stand: 12.09.23)

[11] cf. only Yinka Bokinni: A barrage of assault, racism and rape jokes: my nightmare trip into the metaverse. (25.04.22.) URL: https://www.theguardian.com/tv-and-radio/2022/apr/25/a-barrage-of-assault-racism-and-jokes-my-nightmare-trip-into-the-metaverse (Stand: 12.09.23)

[12] N.N.: Yad Vashem comes to Minecraft for International Holocaust Day. (26.01.23) URL: https://www.i24news.tv/en/news/israel/technology-science/1674714331-yad-vashem-comes-to-minecraft-for-international-holocaust-day (Stand: 12.09.23)

[13] Jcirque25: Using Minecraft as a medium to reflect on the Holocaust (2020). URL: https://www.reddit.com/r/Minecraft/comments/htlst6/using_minecraft_as_a_medium_to_reflect_on_the (Stand: 12.09.23)

[15] Billy Perrigo: OpenAI Used Kenyan Workers on Less Than $2 Per Hour to Make ChatGPT Less Toxic (18.01.23). URL: https://time.com/6247678/openai-chatgpt-kenya-workers/ (Stand: 12.09.23).

Decentralized Autonomous Organisations (DAOs) in the metaverse: the future of extremist organisation?

Julia Ebner is a Senior Research Fellow at the Institute for Strategic Dialogue (ISD), where she has led projects on online extremism, disinformation, and hate speech. Based on her research, she has given evidence to numerous governments, intelligence agencies, and tech firms, and advised international organizations such as the UN, OSCE, Europol, and NATO. Ebner’s first book, The Rage: The Vicious Circle of Islamist and Far-Right Extremism, won the Bruno-Kreisky Award for Political Book of the Year 2018 and has been disseminated as teaching material across Germany by the Federal Agency for Civic Education. Her second bestselling book, Going Dark: The Secret Social Lives of Extremists, was awarded the “Wissenschaftsbuch des Jahres 2020” Prize (“Science Book of the Year 2020”) by the Austrian Federal Ministry for Education as well as the Caspar Einem Prize 2022. Her new book, Going Mainstream: How Extremists Are Taking Over, was published in June 2023. Julia has just completed her DPhil in Anthropology at the University of Oxford.

INTRODUCTION

The transition from Web 2.0 to Web 3.0 is likely to leverage the latest technological advances, including AI, machine learning and blockchain technology. This might not only bring about a new interplay of physical, virtual and augmented reality but could also fundamentally change the ways in which corporate entities, social communities and political movements organise themselves.

Decentralisation is a key property of the Metaverse. Policies, legal contracts and financial transactions that were traditionally the domain of governments, courts and banks might be replaced with smart contracts and financial transactions on the blockchain. Cryptocurrencies can be used as a new medium of exchange outside of established banking systems, while non-fungible tokens (NFTs) might serve as unique digital identifiers to certify ownership and authenticity. Taken together, these new forms of self-governance could lead to an explosion of so-called Decentralized Autonomous Organisations (DAOs) in the Metaverse.

Decentralized Autonomous Organisations (DAOs) are digital entities that are collaboratively governed without central leadership and operate on blockchain technology (Jentzsch, 2016). As such, DAOs allow internet users to establish their own organisational structures, which no longer require the involvement of a third party in financial transactions and rulemaking. DAOs allow online communities to simplify their transactions and take a community-based approach to establishing rules (World Economic Forum, 2023). However, as this study will explore, they might also give rise to new threats emerging from decentralised extremist mobilisation, pose a risk to minority rights, challenge the rule of law, and disrupt institutions that are currently considered fundamental pillars of our democratic systems.
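The organisational logic described above can be illustrated with a deliberately simplified, hypothetical sketch. All names, token balances and thresholds below are invented for illustration; real DAOs encode such rules in on-chain smart contracts (e.g. on Ethereum), not in off-chain Python, but the core idea is the same: a treasury transfer is executed by code once a predefined voting rule is met, with no bank, court or board signing off.

```python
from dataclasses import dataclass

@dataclass
class Proposal:
    description: str
    amount: int           # treasury funds requested
    recipient: str
    votes_for: int = 0
    votes_against: int = 0
    executed: bool = False

class ToyDAO:
    """A toy model of a DAO: token holders vote, code executes."""

    def __init__(self, token_balances):
        self.balances = dict(token_balances)   # member -> voting tokens
        self.treasury = 1_000                  # illustrative treasury size
        self.payouts = []

    def vote(self, proposal, member, support):
        # Token-weighted voting: influence is proportional to holdings.
        weight = self.balances.get(member, 0)
        if support:
            proposal.votes_for += weight
        else:
            proposal.votes_against += weight

    def execute(self, proposal):
        # The "smart contract" step: the transfer happens automatically
        # if and only if the rule (a simple token majority) is satisfied.
        total = sum(self.balances.values())
        if not proposal.executed and proposal.votes_for * 2 > total:
            self.treasury -= proposal.amount
            self.payouts.append((proposal.recipient, proposal.amount))
            proposal.executed = True
        return proposal.executed

dao = ToyDAO({"alice": 60, "bob": 30, "carol": 10})
p = Proposal("Fund community art project", amount=200, recipient="artist.eth")
dao.vote(p, "alice", True)
dao.vote(p, "bob", False)
executed = dao.execute(p)   # alice's 60 of 100 tokens form a majority
```

The sketch also makes the governance concern visible: whoever accumulates a majority of tokens controls the treasury, and there is no external authority that can reverse an executed transfer.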

RESEARCH AIM AND METHODS

The aim of this research paper is to explore potential ways in which DAOs could be exploited by extremist and anti-democratic actors. To what extent are extremist movements incentivised and able to make use of such new forms of self-governance? What are the types of threats that might emerge from anti-democratic or anti-minority DAOs? 

To make progress on these research questions, the project reviewed existing literature and used a combination of expert interviews and digital ethnographic research. More specifically, qualitative interviews were carried out with two leading experts on DAOs, a manual review of roughly 350 existing DAOs was performed, and exploratory digital ethnographic research was carried out across the extremist fringe platforms Odysee, Bitchute and Gab as well as the encrypted apps Telegram and Discord. All quoted and referenced primary-source content was documented and archived in the form of screenshots and can be made available to fellow researchers upon request.

The report will first provide an overview of DAOs, their relationship with the Metaverse and current use cases. It will then assess potential areas of misuse before providing an outlook of future trends, including key challenges and opportunities, and ideas for future research areas. 

DAOS AND THE METAVERSE

The next iteration of the internet, the Metaverse, is inherently tied to decentralised forms of communication, collaboration and finance, and much of its envisioned infrastructure is built on blockchain technology. Researchers have pointed out that collective governance and decentralised finance will likely be key characteristics of the Metaverse (Chao et al, 2022; Goldberg & Schär, 2023; Tole & Aisyahrani, 2023). Ownership of an asset or “a piece of land” in the Metaverse would be governed by NFTs (Laeeq, 2022).

The World Economic Forum described DAOs as “an experiment to reimagine how we connect, collaborate and create” (World Economic Forum, 2023, 27). DAOs operate differently from traditional organisations in their allocation of resources, coordination of activities and decision-making processes. Code-driven and community-oriented structures allow stakeholders to be directly involved in governance and operations via voting systems (World Economic Forum, 2023). Decentraland is the first large-scale virtual world based on the architecture and premises of a DAO. With the aim of distributing power among its participating users, it operates on the Ethereum blockchain. The “residents” of Decentraland can initiate policy updates through decentralised voting systems that are embedded in Decentraland’s DAO governance structure (Goldberg & Schär, 2023, 3).
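To make the kind of decentralised policy vote described above concrete, the following is a minimal, hypothetical sketch of a token-weighted ballot with a turnout quorum. The voter names, token amounts and the 30% quorum are invented for illustration and are not Decentraland’s actual governance parameters:

```python
def tally_policy_vote(voting_power, ballots, quorum=0.3):
    """Tally a token-weighted policy vote.

    voting_power: dict mapping voter -> voting tokens held
    ballots: dict mapping voter -> True (for) or False (against)
    quorum: minimum share of total voting power that must be cast
    """
    total_power = sum(voting_power.values())
    cast_for = sum(voting_power[v] for v, b in ballots.items() if b)
    cast_against = sum(voting_power[v] for v, b in ballots.items() if not b)

    # A proposal is valid only if enough of the community participated...
    turnout = (cast_for + cast_against) / total_power
    if turnout < quorum:
        return "rejected (quorum not met)"

    # ...and passes only if a majority of the cast voting power approves.
    return "passed" if cast_for > cast_against else "rejected"

power = {"v1": 500, "v2": 300, "v3": 150, "v4": 50}
# Only v1 and v3 vote: turnout is 65%, with 500 tokens for vs 150 against.
result = tally_policy_vote(power, {"v1": True, "v3": False})
```

The quorum rule reflects a common design choice in DAO governance: without it, a tiny, highly motivated minority could pass policy updates while the rest of the community is inactive.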

There is an active ongoing debate about the legal status of DAOs. Proponents of DAOs have argued that smart contracts are self-executing code on the blockchain that can operate independently of legal systems. They point to lower transaction costs and a reduced need for intermediaries. However, more sceptical commentators have suggested that DAOs should be owned and/or operated by humans (Jentzsch, 2016; Mondoh et al, 2022). A more detailed discussion of existing efforts to regulate DAOs can be found in the final section of this report.

To date, research into DAOs and the Metaverse remains scarce. However, in recent years there has been a notable rise in publications that investigate the trends and implications of DAOs for various sectors. For example, Chao et al. (2022) discuss the advantages and disadvantages of DAOs for non-profit organisations. While they argue that DAOs provide a “strong democratic system”, they also caution that they “are not subject to legal, physical, or economic constraints” and are therefore capable of operating “outside the control of a single central authority or a single governing body” (Chao et al, 2022). 

Mondoh et al (2022) discuss whether DAOs might be the future of corporate governance, while Goldberg and Schär (2023) write that DAOs could disrupt monopoly market structures in the technology sector. Their assessment is that “open standards and blockchain-based governance are a necessary but not a sufficient condition for a decentralized and neutral platform” (Goldberg & Schär, 2023). Meanwhile, Tole and Aisyahrani (2023) predict that the Metaverse and DAOs have the potential to revolutionise the education system. Through DAOs, they claim, “technology courses, certificates, and more can become automated and authenticated on the blockchain” (Tole & Aisyahrani, 2023, 1). DAOs could provide the basis for what they call a “Metaversity”, whereby the necessary infrastructure is provided for decentralised learning centers that have incentive structures and tailored courses in place (Tole & Aisyahrani, 2023, 3).

CURRENT USE OF DAOS

The current DAO ecosystem is highly diverse and growing at a fast pace. By 2023, DAOs collectively manage billions of dollars and count millions of participants (World Economic Forum, 2023; DeepDAO, 2023). DAOs exist across a range of industries such as finance, philanthropy and politics. A manual review of the 356 DAOs listed on the website Decentralist.com at the time of writing (July 2023) illustrates the highly diverse nature and purpose of DAOs. The descriptions and stipulated mission statements include social, activist, investment, gaming, media and collector aims (Decentralist, 2023). Some DAOs are clearly satirical in nature, including the Café DAO, which aims “to replace Starbucks”; the Doge DAO, which wants to “make the Doge meme the most recognizable piece of art in the world”; and the Hair DAO, “a decentralised asset manager solving hair loss”.


By 2023, DAOs have primarily attracted libertarians, activists, pranksters and hobbyists. Some, however, pursue explicitly activist or political aims. For example, Decentralist.com lists DAOs that are tied to the climate movement and the Black Lives Matter community, as well as social justice and anti-banking DAOs that want to tackle social inequality. Figure 1 provides an overview of the most common DAOs found on Decentralist:

Figure 1: Overview of most common existing DAOs


While the review of existing DAOs did not identify any openly anti-democratic or anti-minority DAOs, some DAOs make use of alt-right codes and conspiracy-myth references, or express support for far-right populist leaders. For example, the “Redacted Club DAO” claims to be a secret network with the aim of “slaying” the “evil Meta Lizard King”. The official Discord chat of the “Redacted Club DAO”, which counts 850 members, contains Pepe the Frog memes and references to “evil lizards” and “good rabbits”. Another DAO, which is not listed on Decentralist, is called “Free Trump DAO”. Its Twitter account describes it as being “for Patriots” and serving as “a powerful tool that fights for freedom and liberty around the world.” Its Telegram channel, which counts 474 members, is filled with Trump-glorifying memes, MAGA symbols and announcements such as “We’re supporting Trump. Bring back Trump”. There is also the “Trump DAO – the 47th US President”, which claims to want to “raise money through the crypto to support Trump for 2024”. The aim is to shape political campaigns through fundraising while giving complete anonymity to people who want to support Trump.

EXTREMIST INCENTIVES TO EXPLOIT DAOS

This sub-chapter draws on ethnographic research in extremist forums and chat groups to assess their conversations about DAOs. The analysis was performed on the far-right fringe platforms Bitchute, Odysee and Gab as well as the encrypted apps Discord and Telegram. An exploratory discourse analysis was used, as the volume of content specifically focused on DAOs in extremist communities is still limited and therefore does not provide enough data points for statistical analysis.

A total of 85 pieces of content were identified across the analysed far-right fringe platforms, 46 on Odysee, 23 on Bitchute and 16 on Gab. Additionally, 100+ channels related to DAOs were detected on Telegram and Discord, including far-right extremist channels. While many of the posts on Odysee, Bitchute and Gab simply explain the characteristics and advantages of DAOs and how they might shape the Web 3.0, some of the detected conversations also reveal that there is much appetite for decentralised alternative forms of collaboration, communication and crowdfunding. Broadly speaking, the intent of far-right activists to use DAOs can be divided into two overarching types of incentives: practical and ideological motivations. 

Important practical reasons are that DAOs can help users to circumvent monitoring, regulatory mechanisms and traditional institutions. For example, extremists may view them as useful to escape surveillance by security services, avoid perceived censorship by tech firms and find an alternative to frozen bank accounts. Many extremist and even terrorist movements have already created their own cryptocurrencies and make use of anonymous bitcoin wallets. In particular, non-transparent cryptocurrencies such as Monero have served extremists whose bank accounts were frozen. Jihadists used cryptocurrencies as early as 2016 to fund violent activities (Irwin & Milad, 2016). In 2019, Treasury Secretary Steve Mnuchin warned that cryptocurrencies pose a national security threat, allowing malicious actors to fund criminal activities (Rappaport & Popper, 2019). While decentralised finance is already a trend among extremists, a shift to entirely decentralised forms of self-governance could be the next step (Krishnan, 2020). For example, Gab users highlighted that DAOs can help organisations to “unlock their full potential and usher in a new era of decentralized governance.” The post continued: “Embrace the power of automation and embark on a journey of efficient and transparent decision-making within your DAO.” Another user shared that DAOs may not be “capable of being subject to sanctions”.

There are also ideological incentives that might lead extremists to use DAOs for their purposes. In particular, fundamental distrust in the establishment means that DAOs can be an appealing alternative. Users who believe that the “deep state” or the “global Jewish elites” control everything from governments and big tech to the global banking system might prefer to set up their own technological, financial and logistical networks. For example, QAnon-related groups on Telegram were discussing the future of decentralized finance and how this might be an escape route to evade the U.S. federal banking system. DAOs also fit ultra-libertarian utopian visions of online worlds that are entirely unregulated, in which speech and actions are “truly free”. “FREEDOM Meta-DAO” declared in its mission statement on Discord: “We believe in freedom of speech, privacy, and protection from cancel-culture bullying. Using a DAO as a service platform, we bring society together as a whole by removing borders and adaptation blockages.” Finally, some users may be motivated by the outlook of creating their own digital state that operates based on their own rules, values and ideologies. 

Figure 2 provides an overview of the practical and ideological incentives:

Practical Incentives | Ideological Incentives
Safe haven from surveillance by security services or journalists | Conspiracy myths about the establishment and financial institutions
Escape route from social media removal policies | Ultra-libertarianism and desire for unlimited free speech
Financial solution to frozen bank accounts | Creation of digital state or corporation according to own rules
Figure 2: Overview of practical and ideological incentives

Most of the posts about DAOs on far-right websites were positive; however, there were also a few warnings among the analysed pieces of content. For example, one user on Gab wrote that “though Blockchain technologies make traditional authoritarianism less likely, they make a new kind of authoritarianism, born of decentralized autonomous organizations (DAOs), more likely.” The user continued: “By design and accident, DAOs will tend to develop into the computational equivalents of eusocial colony animals such as ants, bees, and termites. Once formed into such superorganisms, DAOs will exhibit emergent behaviors like swarming and collective intelligence.” Indeed, the user pointed out that one of the risks is that “humans venturing into the DAOs’ native habitat would then find themselves forced to live under the arbitrary will of not another human, but instead of a vast, mysterious hoard of nonhuman, and perhaps inhuman, entities.”

EXTREMIST CAPABILITIES TO EXPLOIT DAOS

In this section, potential areas for exploitation were analysed based on a literature review and interviews with two leading technology experts. The first interviewee was Carl Miller, Director of the Centre for the Analysis of Social Media (CASM) at the think tank Demos, who has long warned of potential threats emerging from the misuse of DAOs. The second was Christoph Jentzsch, a leading developer of the Ethereum blockchain and the creator of “The DAO”, which became one of the largest crowdfunding campaigns in history, raising over $150 million after launching via token sale in April 2016, but was hacked soon after.

This sub-chapter addresses questions such as: Could trolling armies start cooperating via DAOs to launch election interference campaigns? What happens if anti-minority groups establish their own digital states in which they impose their own governing structures? Finally, how might terrorists leverage DAOs to fund and plot criminal activities and violent attacks? 

DAOs can be used by movements to further their social, political and criminal objectives. Three potential threats were identified related to extremist and terrorist use of DAOs: 1) influence campaigns, 2) radicalisation of sympathisers, and 3) attacks on political opponents. In all three threat areas, DAOs can be exploited in at least three ways on a tactical level: a) coordination and planning activities, b) crowdfunding and purchasing activities, and c) radicalisation and training activities. As such, they could change the nature of rebellion movements as well as their relative position of power, effectiveness and resilience to governmental countermeasures (Krishnan, 2020).

Figures 3 and 4 summarise the potential threats as well as potential tactical exploitation areas associated with the use of DAOs by extremists and terrorists:

Figure 3: Overview of potential tactical areas for exploitation 

Influence Campaigns

One risk is that DAOs might be used in the future to carry out large-scale influence and election interference campaigns. Carl Miller told me that “beyond the speculative activities around crypto and NFTs, the deeper simmering experimentation around governance provided more fundamental challenges to the state and security.” DAOs allow extremist groups to engage in deliberative and collective governance and decision-making in ways where, according to Carl, it is a) very unclear who is doing it, and b) hard for statutory security agencies to do anything about it. “As we gear up for the next U.S. presidential election, we might see DAOs being used for election campaigns,” he warns. For example, a Trump DAO could incentivise people to donate and participate in campaigns. “One could imagine a campaign being funded by tens of thousands of dark wallets that do not have clear links to real-world identities.” Would this be recognised as electoral campaigning? Carl noted that regulation in the U.S. currently operates at the wallet level, not at the DAO level. However, these wallets can be fully anonymous. “There are already crypto-based gig sites where small jobs can be done anonymously based on crypto remuneration,” Carl said. “In the future we might see gig workers being paid to run disinformation campaigns or stage protests.”

Radicalisation of Sympathisers

Another risk is that DAOs could facilitate radicalisation efforts undertaken by extremist groups. Carl noted that “the basic problem of extremists is that they are being denied the conventional, easiest ways of reaching people”. For example, they find it hard to rent halls, maintain Facebook groups, raise money or get the word out. He continued: “You could easily see a world where there are protocol-based alternatives or replacements for conventional organizational structures”, allowing extremist movements to surmount collective action problems, create international coalitions and fundraise for their activities.

Extremist movements could potentially even create their own digital states, for example in the form of digital white ethnostates or cyber caliphates. In 2022, Balaji Srinivasan, the former chief technology officer at Coinbase, described in his book The Network State how DAOs could soon give rise to new forms of digital statehood. Any group of online users could decide to start their own country, with its own laws, social services and financial transactions (Srinivasan, 2022).

Attacking Political Opponents 

Finally, DAOs might also serve as safe havens for the planning and plotting of violent terrorist attacks and cybercrimes. The circumvention of government regulation and monitoring activities could make them particularly useful for violent extremist and criminal organisations. Moreover, white nationalist movements have long been advocating for decentralised structures and so-called “leaderless resistance” (Michael, 2012; Malone et al, 2022). Increasingly, jihadist organisations such as Islamic State and Al Qaeda have adopted similarly decentralised models of leadership. Researchers have argued that leaderless organisations are more resilient to disruptions and interventions than hierarchical organisations. They compared hierarchical organisations to spiders, which die when their head is cut off, while they equated leaderless organisations with starfish, whose legs regrow when cut off (Brafman & Beckstrom, 2007, 19-29).

According to Christoph Jentzsch, the most notable difference between DAOs and traditional forms of organisations is that, unlike associations or co-operatives, DAOs cannot be outlawed and their assets or bank accounts cannot be frozen. “If you can get organised without relying on traditional infrastructure, this also means that you won’t be controlled by traditional institutions. It would be difficult to intervene on a statutory level”.  Jentzsch argues that cash already provides an alternative route for clandestine groups to fundraise, hold or spend money. Yet, DAOs might make it easier to conduct international, anonymous operations.

DAOs could give rise to new forms of dark markets. It might even be possible for DAOs to facilitate an anonymous assassination market (Krishnan, 2020). Carl noted that there would probably be law enforcement responses to terrorist plots and other serious criminal activities, which could include “a mix of infiltration and subversion, and perhaps direct cyber offensive activity.” DAOs would be resilient in some ways because they are decentralised and rely on smart contracts. However, they might also be more vulnerable to hacking. “It’s difficult for creators to know when they are safe from a hacking attack due to their complex structures,” Carl explained.

CONCLUSIONS, FUTURE RESEARCH AND CHALLENGES ON THE HORIZON

DAOs bring a range of both challenges and opportunities for democratic culture in cyberspace. Advocates of DAOs have argued that these blockchain-based forms of self-governance promise enhanced security, transparency and trust, and reduce the transaction costs that arise through intermediaries. Meanwhile, this paper has explored some of the key challenges and risk areas for national security and democracy.

The potential exploitation of DAOs for extremist or criminal purposes has not received enough attention in the research and policy communities. This report identified a range of ways in which DAOs might be misused by extremist movements in the future, which could challenge the rule of law, pose a threat to minority groups, and disrupt institutions that are currently considered fundamental pillars of democratic systems. More specifically, the study explored how extremist movements might tap into DAOs to plan, coordinate and launch influence and interference operations, radicalisation campaigns and violent attacks. 

Risks associated with the misuse of DAOs for extremist and criminal purposes have not been on the radar of global policymakers. Many governments have started to develop or pass legal frameworks to regulate AI. However, few countries or regions have even recognised the existence of DAOs or considered regulating them. Technology expert Carl Miller said that “even though DAOs behave like companies, they are not registered as legal entities”. There are only a few exceptions: the U.S. states Wyoming, Vermont and Tennessee have passed laws to legally recognise DAOs.

This report is a first attempt to understand the potential risk areas of DAOs in the Metaverse. While this study focused on non-state actors, exploring emerging threats from hostile state actors might be another important area for future research. Studies could also use experimental methods and interviews with policymakers and law enforcement to map the threat landscape. “DAOs can be used by anyone: by charities, investment funds, but yes, they can also be used by terrorist organisations,” Christoph Jentzsch argues. However, he believes that the positive use cases significantly outweigh the negative ones. Future studies should investigate both the potential opportunities and challenges related to collective decision-making and self-governance, diversity and the protection of minorities, and radicalisation and extremism in decentralised communities in the Metaverse. The positive ways in which DAOs can shape future democratic culture are just as poorly understood as the negative impact they could have on politics and society.

BIBLIOGRAPHY

Brafman, Ori and Rod A. Beckstrom, The Starfish and the Spider: The Unstoppable Power of Leaderless Organizations (London: Penguin, 2007).

Chao, Chian-Hsueng, I-Hsien Ting, Yi-Jun Tseng, Bing-Wen Wang, Shin-Hua Wang, Yu-Qing Wang, and Ming-Chun Chen. “The Study of Decentralized Autonomous Organizations (DAO) in Social Network”. Proceedings of the 9th Multidisciplinary International Social Networks Conference, 2022, 59-65.

Decentralist. “The List of DAOs”. 2023. Available online: https://www.decentralist.com/list-of-daos. 

DeepDAO. “Organizations”. 2023. Available online: Deepdao.io/organziations.

Goldberg, Mitchell, and Fabian Schär. “Metaverse governance: An empirical analysis of voting within decentralized autonomous organizations”. Journal of Business Research 160 (2023).

Irwin, Angela S.M., and George Milad. “The Use of Crypto-Currency in Funding Violent Jihad”. Journal of Money Laundering Control 19, no. 4 (2016): 411.

Jentzsch, Christoph. “Decentralized autonomous organization to automate governance”. White Paper, 2016.

Krishnan, Armin. “Blockchain Empowers Social Resistance and Terrorism Through Decentralized Autonomous Organizations”. Journal of Strategic Security 13, no. 1 (2020). 

Laeeq, Kashif. “Metaverse: Why, How and What”. Presentation, February 2022. Available online: https://www.researchgate.net/publication/358505001_Metaverse_Why_How_and_What. 

Malone, Iris, Lauren Blasco, and Kaitlyn Robinson. “Fighting the Hydra: Combatting Vulnerabilities in Online Leaderless Resistance Networks”. National Counterterrorism, Innovation, Technology and Education Center, U.S. Department of Homeland Security, September 2022. Available online: https://digitalcommons.unomaha.edu/ncitereportsresearch/24/.

Michael, George. Lone Wolf Terror and the Rise of Leaderless Resistance (Nashville: Vanderbilt University Press, 2012).

Mondoh, Brian Sanya, Sara M. Johnson, Matthew Green, and Aris Georgopoulos. “Decentralised Autonomous Organisations: The Future of Corporate Governance or an Illusion?”. 23 June 2022.

Rappaport, Alan, and Nathaniel Popper. “Cryptocurrencies Pose National Security Threat, Mnuchin Said”. New York Times, July 15, 2019. Available online: https://www.nytimes.com/2019/07/15/us/politics/mnuchin-facebook-libra-risk.html.

Srinivasan, Balaji. The Network State: How to Start a New Country (Self-published book, 2022).

Sutikno, Tole, and Asa Ismia Bunga Aisyahrani. “Non-fungible tokens, decentralized autonomous organizations, Web 3.0, and the metaverse in education: From university to metaversity”. Journal of Education and Learning 17, no. 1 (2023), 1-5.

Venis, Sam. “Could New Countries be Founded – On the Internet?”. The Guardian, 5 July 2022. Available online: https://www.theguardian.com/commentisfree/2022/jul/05/could-new-countries-be-founded-on-the-internet.

World Economic Forum. “Decentralized Autonomous Organizations Toolkit”. Insight Report, January 2023. Available online: https://www3.weforum.org/docs/WEF_Decentralized_Autonomous_Organization_Toolkit_2023.pdf.

  • 1
    All quoted and referenced primary source content was documented and archived in the form of screenshots and can be made available to fellow researchers upon request.

A safe space for everyone – a plea for a democratic and participative metaverse

Dr Octavia Madeira, Institute for Technology Assessment and Systems Analysis (ITAS), Karlsruhe Institute of Technology (KIT)

Dr Georg Plattner, Institute for Technology Assessment and Systems Analysis (ITAS), Karlsruhe Institute of Technology (KIT)

1 MALEVOLENT ACTORS IN THE METAVERSE

The vision of a metaverse presented by Meta CEO Mark Zuckerberg in October 2021 was a watershed moment for society and for the tech world. Although the concept of a second digital life, including a digital identity, is neither new nor exclusive to Meta (see, for example, Second Life), this was the first time that nearly all of the metaverse's envisaged functionalities had been presented together. Special emphasis is placed on immersion via virtual reality which, as an extension of today's Internet applications, is meant to give users a completely new sense of participation and let them experience the metaverse in a multimodal and multi-sensory way. The presentation of the vision also focused in particular on technological permeability: the diffusion of social media into all areas of human life and thus its transformation from a pure entertainment platform into something far more pervasive.

Should the metaverse turn out to be as Zuckerberg and other proponents envision it, this would mean a radical transformation of social interaction with the digital space, and also a radical change in our everyday lives. Shopping could increasingly shift to the metaverse as an immersive experience, sports classes could take place in a virtual environment and virtual church services could be held with believers from all over the world. The world of work has already permanently changed, partly due to the coronavirus pandemic – and we could soon move from working at home to working in the meta-office.

But these innovations will not only change our everyday lives – they will also push extremism and radicalisation in new directions as these phenomena adapt to new environments. Generally speaking, extremists use technologies that are cheap, readily available, easy to use and widely accessible for purposes such as propaganda, communication and recruitment. Using a technology for a function other than that intended by its developers, with the intention of doing harm to others, is an inherently creative process. Cropley, Kaufman and Cropley (2008) call this “malevolent creativity”. They define it as a form of creativity that “is deemed necessary by some society, group, or individual to fulfill goals they regard as desirable, but has serious negative consequences for some other group, these negative consequences being fully intended by the first group” (106). We describe actors who display malevolent creativity (such as extremists or spreaders of fake news) as malevolent actors.

Malevolent actors have historically been especially creative in reorganising themselves and disseminating their ideology. The digital revolution has equipped them with an unprecedented number of tools with which to further their cause: from (encrypted and instant) mass communication for propaganda and recruitment to alternative instruments for financing operations and logistics through to new means of destruction and terror. Recent technological advances have opened up a wide range of new opportunities for malevolent actors. For example, Web 2.0, the rise of social media and the availability of nearly all content on the Internet have enabled these actors to easily connect with other like-minded individuals and form almost entirely closed communities that reinforce their own views.

Research into the metaverse as the successor to social media and the mobile Internet can provide important insights into how malevolent actors could creatively use the metaverse. While we generally agree with Joe Whittaker and others (Whittaker 2022; Valentini, Lorusso and Stephan 2020) that distinguishing between offline and online radicalisation does not make sense from an analytical perspective, the way in which malevolent actors are currently using social media could give an idea of the metaverse of the future.

It is generally recognised that malevolent actors (with different ideological backgrounds) began to make use of the Internet and its possibilities at an early stage (Feldman 2020; Fisher 2015; Stewart 2021; Lehmann and Schröder 2021). They used new technologies in creative ways in order to evade monitoring and detection and also to improve their own operations. As an anonymous place of countless possibilities where one can find a wealth of information tailored to one’s own interests, the Internet is a gold mine for extremists (Bertram 2016, p. 232).

While research on the radicalisation patterns of convicted jihadi terrorists has shown that offline networks played a much greater role in their radicalisation than online networks (Hamid and Ariza 2022), other research indicates that the Internet has a more important role for right-wing extremists. This applies especially to the planning of their attacks and actions (von Behr et al. 2013; Gill et al. 2017). “The Internet is largely a facilitative tool that affords greater opportunities for violent radicalization and attack planning. Nevertheless, radicalization and attack planning are not dependent on the Internet […].” (Gill et al. 2017, p. 113).

In particular, social media has been used by malevolent groups to create, target and distribute self-generated content without the vetting processes of traditional media companies, while avoiding policing and censorship by nation states (Droogan et al. 2018, p. 171). Furthermore, social media has also become an instrument of social interaction for those who are already radicalised and for those they want to convince or who are interested in their activities (Conway 2017).

The introduction of the metaverse could further reinforce this momentum. By further bridging the gap between offline and online, it could make it even more difficult to maintain the distinction between the two spheres of radicalisation (and extremism and terrorism). At present, offline networks provide familiarity and a close environment and are more likely to evade security services than online networks (Hamid and Ariza 2022). The future metaverse could bring together these advantages of the offline world in an extensive and immersive digital experience. Combined with the advantages of the online world – instant mass communication and propaganda – the metaverse could become an even bigger game-changer than the Internet and social media were.

2 THE METAVERSE AS A DEMOCRATIC SPACE

The metaverse is still in the early stages of development and has a long way to go before it reaches maturity and its promises become actual functionalities. It is already apparent that the risks of the metaverse are comparable to those of social media, where responses often came too late. Freedom and security will probably be the decisive variables in this technology of the future, which makes engaging with malevolent actors all the more crucial (Neuberger 2023). Even in this initial phase, it is becoming clear that malevolent actors are finding fertile ground – as illustrated, for example, by the incidents of sexual harassment that have occurred in current test versions of the metaverse (Basu 2021; Bovermann 2022; Diaz 2022; Wiederhold 2022).

How can these developments be tackled? How can they be prevented before they cause harm? It will be important to ensure the democratic involvement of actors and marginalised groups in decision-making and development processes. While this would now be a genuinely reactive process in the case of social media, the developers of the metaverse still have the opportunity to build beneficial structures. Democratisation of social media is desirable from a sociopolitical perspective because it is a very powerful tool due to its widespread use and its economic and cultural importance. This power should be democratically legitimised and controlled (Engelmann et al. 2020). However, democratic safeguarding should not follow a party political pattern.

In the development of the metaverse, social media should be informative in various ways – from the creativity with which malevolent actors use new media and technologies (see above) through to the democratic involvement of users. Social media operators have already tried to take account of the aspect of participation:

  • Meta conceived the idea of an Oversight Board in 2018 as a body whose independent judgement could help the company make tough content decisions. This board is committed to being independent, accessible and transparent. Meta has granted it the authority to decide whether content should be allowed or removed.
  • Twitter has been advised by a Trust and Safety Council in the past. This consisted of various NGOs and researchers who advised the company on online security issues. Elon Musk dissolved the Council after taking over the company (The Associated Press 2022). 
  • On its YouTube video platform, Google has introduced the Trusted Flagger programme. This enables NGOs and public authorities to use highly effective tools to report content that violates the Community Guidelines; flagged content is then reviewed by moderators as a priority. However, the deletion criteria are the same as for any other reports. The programme was revised by YouTube in 2021, which drew major criticism from the community (Meineck 2021).

In general, there seems to be a worrying trend on social media to cut back on these participative models of moderation and security in favour of artificial intelligence (AI) applications (Gorwa et al. 2020; Llansó 2020). However, AI solutions cannot and should not replace the involvement of civil society in decision-making processes and questions of democratic culture, not least because AI-supported content moderation solutions are still prone to error and lack transparency (Gillespie 2020; Gorwa et al. 2020). 

3 ENCOURAGING PARTICIPATION AND DEMOCRATIC INVOLVEMENT

In social media research and particularly in platform governance research, important approaches can be found that may help to enable a democratic and inclusive metaverse. In addition to essential cooperation between operators and governmental and non-governmental actors on issues of transparency and research, there is an emphasis in particular on actively strengthening democratic actors and narratives (Bundtzen and Schwieter 2023; Engelmann et al. 2020; Rau et al. 2022). 

This strategy is crucial in order to ensure that a state’s repressive apparatus is actually only used as a measure of last resort to stop malevolent actors. Democratic argument and discourse must be possible in an inclusive metaverse without people constantly having to fear repression and restriction. Instead, platform operators can also take steps in the metaverse to consciously and actively promote democratic actors and narratives, and thus build democratic resilience in the metaverse. 

Here too, the metaverse can take inspiration from existing approaches in the social media field, such as YouTube’s trusted flagging programme. Democratic actors, e.g. NGOs and government organisations, specialising in areas such as hate speech, group-focused enmity or strengthening democracy could have access to special reporting tools. They could also be given extended powers to contextualise questionable content. 

However, as well as reinforcing democratic narratives, the democratisation of the platform itself is a crucial factor for inclusivity. Involving users in decision-making and design processes can have enormous added value for a platform that is interested in democratic interaction. Marginalised groups and their representatives know exactly where hate and harassment may be lurking in the digital space. By involving such stakeholders at an early stage, some of the mistakes that were made on social media could be minimised from the outset.

In political practice, mini-publics have already proved effective as an instrument of user participation (Escobar and Elstub 2017; Smith and Setälä 2018). Mini-publics are groups of (randomly or systematically) selected citizens who work together over an extended period to examine socially relevant issues, with the inclusion of external sources, e.g. scientific expertise. Topics are examined, discussed and assessed from a broad range of perspectives, and the resulting recommendations are forwarded to political decision-makers (Escobar and Elstub 2017; Pek et al. 2023). One example of this is the virtual citizens’ assembly in Germany. In June 2022, its members debated the consequences of using artificial intelligence (Buergerrat.de 2022). These types of assemblies allow platform-specific topics to be discussed with the aim of ensuring that decision-making is more democratic. 

Although quite controversial (see above), platform councils can also develop potential for promoting democracy if they are able to operate independently, objectively and transparently (Haggart and Keller 2021; Rau et al. 2022). To ensure this, platform councils of this type could be based on the press and broadcasting councils that are already established in Germany, in line with the recommendations of Kettemann and Fertmann (2021). It should be noted, however, that responsibilities (geographical, practical), participants (citizens, experts, NGOs, political decision-makers) and not least powers (quasi-judicial, advisory) must be part of the social discourse and cannot yet be conclusively clarified (Cowls et al. 2022; Kettemann and Fertmann 2021). Furthermore, such councils could boost public confidence – the more diverse and transparent their line-up is and also the more publicly visible the effects of their recommendations are.

Last but not least, the aim must also be to strengthen media literacy and policy competence by means of various training opportunities. These should be designed in such a way that individuals who are not (or no longer) associated with the education system are also able to benefit from them. Here it is vital to provide the necessary tools for dealing with fake news, other manipulated or extremist content and also hate speech on the Internet. One example to mention is the Good Gaming – Well Played Democracy project directed by the Amadeu Antonio Foundation, which aims to raise the gaming community’s awareness of extremist content, among other things.

In addition, it must be noted that building a democratic metaverse is not solely a task for citizens. The creation of a digital twin of a well-fortified democracy is also important. According to Rau et al. (2022), this does not exclusively mean the use of repressive measures such as the deletion or suppression of problematic content (see, for example, Bellanova and De Goede 2022) but also, coupled with this, the strengthening of democratic actors, e.g. through algorithmically increased visibility. In this context, the empowerment of marginalised democratic actor groups becomes especially important in order to adequately represent social diversity. Such groups are often well trained in recognising problematic content at an early stage and can thus also be consulted for advice (Rau et al. 2022). The use of counter speech could be another strategy for tackling extremist content in the metaverse (Clever et al. 2022; Hangartner et al. 2021; Kunst et al. 2021; Morten et al. 2020). The term (digital) counter speech refers to comments or other content posted in response to hate speech in order to minimise and weaken its impact or to support potential victims (Ernst et al. 2022; Garland et al. 2022). Studies have shown that counter speech can be an effective means of tackling extremist content and reducing it (Garland et al. 2022; Hangartner et al. 2021). In the context of newer technologies such as AI, consideration is currently being given to implementing counter speech automatically in certain circumstances, although final concepts and responsibilities are still the subject of intensive discussion (Clever et al. 2022).

In addition to participatory methods, legislation can also be used to counter extremist content. In Germany, the dissemination of unconstitutional symbols and signs is forbidden and perpetrators can be prosecuted. Germany's Network Enforcement Act (Netzwerkdurchsetzungsgesetz, NetzDG) also provides a legal framework for dealing with hate crime on social media. At the EU level, the Terrorist Content Online Regulation (European Union 2021) requires platform operators offering services in the EU to remove or block reported terrorist content within one hour. Recent findings in extremism research indicate, however, that so-called legal but harmful content is already proving to be a major challenge and is likely to be significant in the metaverse as well (Jiang et al. 2021; Rau et al. 2022). This includes, for example, digital content that may have a subtle radicalising effect but is not unlawful. It should be noted in this regard that content moderation must comply with the constitutional principle of free speech. Consequently, it is to be assumed that the ongoing discussion on the relationship between freedom and security will also significantly influence the design of the metaverse and will, or must, be the result of a negotiation process involving society as a whole in order to guarantee its democratic dimension.

4 DISCUSSION

If the immersiveness of the metaverse measures up to Mark Zuckerberg's vision, it is very likely to have a huge impact on our everyday lives and on social interaction. This immersiveness would mean that the operators of the metaverse (or metaverses) would need to engage intensively with questions of democratisation. Not only would the state probably play a (yet to be defined) role in a metaverse, but its users must also be enabled to participate democratically in it. This would help to make the platform inclusive and as safe as possible from malevolent actors.

Building on the social media research of recent decades, there are many common points of reference which can support and steer the design of a democratic metaverse. As mentioned above, the metaverse is still at an early stage of development. Given the rapid pace of advancement, however, it is vital to support this process, remain engaged and take an active role in discussions. A multi-perspective approach involving all stakeholders is also needed to ensure a balance between security and freedom for all users. The possibilities outlined here present some solutions for implementing democratic pillars. In summary, platform operators should deploy the following measures:

  • early implementation of methods for user participation, e.g. mini-publics or independent platform councils
  • strengthening of democratic actors and inclusion of marginalised groups
  • reference to existing scientific research findings on social media, hate speech and (digital) extremism, as well as open cooperation with research institutions
  • offer of educational opportunities in cooperation with democratic actors

Final and concrete implementation is still the subject of lively discussion. However, the metaverse's early stage of development invites active participation, which is also reflected in this Immersive Democracy project and can be understood as an invitation to join this process. Participation is not a panacea for the dangers that lurk in the digital space. But it is an important source of support that can help to empower marginalised groups and individuals in specific ways, and thus give them the tools to work together with operators against discrimination and hate in the metaverse. Now is the time to develop these tools and make sure that a future metaverse is as safe and secure as possible for everyone.

BIBLIOGRAPHY

Basu, Tanya (2021): The metaverse has a groping problem already. MIT Technology Review. Available online at https://www.technologyreview.com/2021/12/16/1042516/the-metaverse-has-a-groping-problem/, last checked on 28 September 2022.

von Behr, Ines; Reding, Anaïs; Edwards, Charlie; Gribbon, Luke (2013): Radicalisation in the Digital Era: The Use of the Internet in 15 Cases of Terrorism and Extremism. RAND Europe.

Bellanova, Rocco; De Goede, Marieke (2022): Co‐Producing Security: Platform Content Moderation and European Security Integration. In: JCMS: Journal of Common Market Studies 60 (5), pp. 1316–1334. https://doi.org/10.1111/jcms.13306

Bertram, Luke (2016): Terrorism, the Internet and the Social Media Advantage: Exploring how terrorist organizations exploit aspects of the internet, social media and how these same platforms could be used to counter-violent extremism. In: Journal for Deradicalization 2016 (7), pp. 225–252.

Bovermann, Philipp (2022): Online-Belästigungen im Metaverse – Am eigenen Leib. Süddeutsche Zeitung. Available online at https://www.sueddeutsche.de/kultur/metaverse-vr-virtual-reality-microsoft-sexuelle-belaestigung-1.5519527?print=true, last checked on 4 February 2022.

Buergerrat.de (2022): Bürgerrat diskutierte über künstliche Intelligenz. Buergerrat.de. Available online at https://www.buergerrat.de/aktuelles/buergerrat-diskutierte-ueber-kuenstliche-intelligenz/, last checked on 22 June 2023.

Bundtzen, Sara; Schwieter, Christian (2023): Datenzugang zu Social-Media-Plattformen für die Forschung: Lehren aus bisherigen Maßnahmen und Empfehlungen zur Stärkung von Initiativen inner- und außerhalb der EU. Berlin: Institute for Strategic Dialogue (ISD).

Clever, Lena; Klapproth, Johanna; Frischlich, Lena (2022): Automatisierte (Gegen-)Rede? Social Bots als digitales Sprachrohr ihrer Nutzer*innen. In: Julian Ernst, Michalina Trompeta and Hans-Joachim Roth (eds.): Gegenrede digital: Neue und alte Herausforderungen interkultureller Bildungsarbeit in Zeiten der Digitalisierung. Wiesbaden: Springer Fachmedien, (Interkulturelle Studien), pp. 11–26. https://doi.org/10.1007/978-3-658-36540-0_2

Conway, Maura (2017): Determining the Role of the Internet in Violent Extremism and Terrorism: Six Suggestions for Progressing Research. In: Studies in Conflict & Terrorism Routledge, 40 (1), pp. 77–98. https://doi.org/10.1080/1057610X.2016.1157408

Cowls, Josh; Darius, Philipp; Santistevan, Dominiquo; Schramm, Moritz (2022): Constitutional metaphors: Facebook’s “supreme court” and the legitimation of platform governance. In: New Media & Society. https://doi.org/10.1177/14614448221085559

Cropley, David H.; Kaufman, James C.; Cropley, Arthur J. (2008): Malevolent Creativity: A Functional Model of Creativity in Terrorism and Crime. In: Creativity Research Journal 20 (2), pp. 105–115. https://doi.org/10.1080/10400410802059424

Diaz, Adriana (2022): Disturbing reports of sexual assaults in the metaverse: ‘It’s a free show’. New York Post. Available online at https://nypost.com/2022/05/27/women-are-being-sexually-assaulted-in-the-metaverse/, last checked on 28 September 2022.

Droogan, Julian; Waldek, Lise; Blackhall, Ryan (2018): Innovation and terror: an analysis of the use of social media by terror-related groups in the Asia Pacific. In: Journal of Policing, Intelligence and Counter Terrorism Routledge, 13 (2), pp. 170–184. https://doi.org/10.1080/18335330.2018.1476773

Engelmann, Severin; Grossklags, Jens; Herzog, Lisa (2020): Should users participate in governing social media? Philosophical and technical considerations of democratic social media. In: First Monday. https://doi.org/10.5210/fm.v25i12.10525

Ernst, Julian; Trompeta, Michalina; Roth, Hans-Joachim (2022): Gegenrede digital – Einleitung in den Band. In: Julian Ernst, Michalina Trompeta and Hans-Joachim Roth (eds.): Gegenrede digital. Wiesbaden: Springer Fachmedien Wiesbaden, (Interkulturelle Studien), pp. 1–7. https://doi.org/10.1007/978-3-658-36540-0_1

Escobar, Oliver; Elstub, Stephen (2017): Forms of Mini-Publics: An introduction to deliberative innovations in democratic practice. (Research and Development Note) New Democracy.

European Union (2021): Regulation (EU) 2021/784 of the European Parliament and of the Council of 29 April 2021 on addressing the dissemination of terrorist content online (Text with EEA relevance). OJ L.

Garland, Joshua; Ghazi-Zahedi, Keyan; Young, Jean-Gabriel; Hébert-Dufresne, Laurent; Galesic, Mirta (2022): Impact and dynamics of hate and counter speech online. In: EPJ Data Science 11 (1), p. 3. https://doi.org/10.1140/epjds/s13688-021-00314-6

Gill, Paul; Corner, Emily; Conway, Maura; Thornton, Amy; Bloom, Mia; Horgan, John (2017): Terrorist Use of the Internet by the Numbers: Quantifying Behaviors, Patterns, and Processes. In: Criminology & Public Policy 16 (1), pp. 99–117. https://doi.org/10.1111/1745-9133.12249

Gillespie, Tarleton (2020): Content moderation, AI, and the question of scale. In: Big Data & Society 7 (2). https://doi.org/10.1177/2053951720943234

Gorwa, Robert; Binns, Reuben; Katzenbach, Christian (2020): Algorithmic content moderation: Technical and political challenges in the automation of platform governance. In: Big Data & Society 7 (1). https://doi.org/10.1177/2053951719897945

Haggart, Blayne; Keller, Clara Iglesias (2021): Democratic legitimacy in global platform governance. In: Telecommunications Policy 45 (6). https://doi.org/10.1016/j.telpol.2021.102152

Hamid, Nafees; Ariza, Cristina (2022): Offline Versus Online Radicalisation: Which is the Bigger Threat? London: Global Network on Extremism & Technology.

Hangartner, Dominik et al. (2021): Empathy-based counterspeech can reduce racist hate speech in a social media field experiment. In: Proceedings of the National Academy of Sciences 118 (50). https://doi.org/10.1073/pnas.2116310118

Jiang, Jialun Aaron; Scheuerman, Morgan Klaus; Fiesler, Casey; Brubaker, Jed R. (2021): Understanding international perceptions of the severity of harmful content online. In: PLOS ONE 16 (8). https://doi.org/10.1371/journal.pone.0256762

Kettemann, Matthias C.; Fertmann, Martin (2021): Die Demokratie Plattformfest Machen: Social Media Councils als Werkzeug zur gesellschaftlichen Rückbindung der privaten Ordnungen digitaler Plattformen. Potsdam-Babelsberg: Friedrich-Naumann-Stiftung.

Kunst, Marlene; Porten-Cheé, Pablo; Emmer, Martin; Eilders, Christiane (2021): Do “Good Citizens” fight hate speech online? Effects of solidarity citizenship norms on user responses to hate comments. In: Journal of Information Technology & Politics 18 (3), pp. 258–273. https://doi.org/10.1080/19331681.2020.1871149

Llansó, Emma J (2020): No amount of “AI” in content moderation will solve filtering’s prior-restraint problem. In: Big Data & Society 7 (1). https://doi.org/10.1177/2053951720920686

Meineck, Sebastian (2021): Trusted Flagger: YouTube serviert freiwillige Helfer:innen ab. netzpolitik.org. Available online at https://netzpolitik.org/2021/trusted-flagger-youtube-serviert-freiwillige-helferinnen-ab/, last checked on 27 June 2023.

Morten, Anna; Frischlich, Lena; Rieger, Diana (2020): Gegenbotschaften als Baustein der Extremismusprävention. In: Josephine B. Schmitt, Julian Ernst, Diana Rieger und Hans-Joachim Roth (eds.): Propaganda und Prävention. Wiesbaden: Springer Fachmedien Wiesbaden, pp. 581–589. https://doi.org/10.1007/978-3-658-28538-8_32

Neuberger, Christoph (2023): Sicherheit und Freiheit in der digitalen Öffentlichkeit. In: Nicole J. Saam and Heiner Bielefeldt (eds.): Sozialtheorie. Bielefeld: transcript Verlag, pp. 297–308.

Pek, Simon; Mena, Sébastien; Lyons, Brent (2023): The Role of Deliberative Mini-Publics in Improving the Deliberative Capacity of Multi-Stakeholder Initiatives. In: Business Ethics Quarterly 33 (1), pp. 102–145. https://doi.org/10.1017/beq.2022.20

Rau, Jan; Kero, Sandra; Hofmann, Vincent; Dinar, Christina; Heldt, Amélie P. (2022): Rechtsextreme Online-Kommunikation in Krisenzeiten: Herausforderungen und Interventionsmöglichkeiten aus Sicht der Rechtsextremismus- und Platform-Governance-Forschung. In: Arbeitspapiere des Hans-Bredow-Instituts SSOAR – GESIS Leibniz Institute for the Social Sciences. https://doi.org/10.21241/SSOAR.78072

Smith, Graham; Setälä, Maija (2018): Mini-Publics and Deliberative Democracy. In: Andre Bächtiger, John S. Dryzek, Jane Mansbridge and Mark Warren (eds.): The Oxford Handbook of Deliberative Democracy. Oxford University Press, pp. 299–314. https://doi.org/10.1093/oxfordhb/9780198747369.013.27

The Associated Press (2022): Musk’s Twitter has dissolved its Trust and Safety Council. National Public Radio (NPR). Washington, DC, 12 December 2022.

Wiederhold, Brenda K. (2022): Sexual Harassment in the Metaverse. In: Cyberpsychology, Behavior, and Social Networking 25 (8), pp. 479–480. https://doi.org/10.1089/cyber.2022.29253.editorial
