From Gaming to Grooming: How Far-Right Extremists Exploit Video Game-Adjacent Platforms for Radicalization and Recruitment
Abstract
Research on extremism in gaming has often focused on the content of video games themselves. Much less attention has been given to the digital spaces that surround gaming, including livestreaming platforms, chat systems, and creator communities where audiences spend far more time than in the games. This study examines how far-right actors exploit these gaming-adjacent platforms, including DLive, Twitch, and Odysee, to spread propaganda, encourage cognitive radicalization, and move followers into more extreme online environments.
Drawing on three detailed case studies, the article shows how extremists use the interactive features of these platforms, such as real-time chat, donations, audience participation, and cross-platform migration, to weave ideological messages into familiar entertainment formats. These practices do more than build loyal fanbases; they create ongoing pathways for radicalization, making extremist narratives feel social, participatory, and emotionally engaging.
The findings demonstrate that the risk does not lie in video games themselves but in the social infrastructure that surrounds them. These insights have direct implications for platform governance, counter-extremism practitioners, and those working in the gaming industry. Understanding how extremists repurpose these digital ecosystems is essential for monitoring fringe online spaces and reducing opportunities for ideological grooming and mobilization.
Introduction
Far-right extremists have increasingly turned to non-traditional cultural and digital spaces to spread propaganda and recruit followers. These include not only music, sports, or university campuses, but also less-regulated online environments that allow for real-time interaction and peer-driven engagement. As content moderation intensifies on mainstream platforms, extremists have migrated to alternative infrastructures where moderation is minimal or ideologically permissive.[1]
Among these alternative spaces, video game–adjacent platforms—digital environments that facilitate livestreaming, chat, and community-building around gaming—have emerged as particularly effective arenas for ideological diffusion. Platforms such as DLive, Odysee, and Twitch are not video games themselves, but orbit the gaming ecosystem by supporting content creators, hosting tournaments, and enabling ongoing social interaction. Their interactive design makes them especially appealing to young, digitally fluent audiences and simultaneously attractive to far-right actors seeking to bypass conventional content controls.[2]
Existing literature on extremism and gaming has primarily examined the ideological content of games themselves—exploring how violent narratives, enemy typologies, or extremist symbolism are embedded in gameplay.[3] In recent years, however, a small body of scholarship has begun to shift attention toward the infrastructures surrounding games. Emerging contributions have addressed extremist exploitation of gaming-adjacent platforms such as Twitch, Discord, and DLive, providing important descriptive accounts of activity and moderation challenges.[4]
While these works make a valuable contribution to understanding extremist activity in gaming-adjacent environments, they remain primarily descriptive or platform-specific, often focusing on single-platform monitoring (e.g., ISD Global’s analysis of extremist use of DLive).
This article advances the discussion by conducting sustained qualitative analysis of platform affordances—such as livestream chat, gamified incentives, and cross-platform audience migration—and by examining how these features enable cognitive radicalization through three in-depth case studies of far-right actors.
Drawing on sustained qualitative engagement with these three streamers, the article explores how ideological content is seeded, how users are encouraged to migrate to more radical spaces, and how these platforms facilitate not only propaganda dissemination but also recruitment, radicalization, and financial support. Finally, the paper contributes to the growing scholarship on online radicalization by examining how cognitive radicalization unfolds through everyday interactions within gaming-adjacent spaces. In this literature, cognitive radicalization is described as the internalization of extremist beliefs and worldviews, typically occurring prior to behavioral change.[5]
Yet existing research has not sufficiently explored how this process manifests through the rhythms and social dynamics of gaming-adjacent platforms. Recent studies show that extremists exploit the communicative and symbolic affordances of these environments – such as follower interaction, gamified engagement, shared language, and ambient ideological cues – to lower entry barriers to extremism and foster ideological convergence.[6]
The article proceeds in four sections: First, it reviews existing scholarship on cognitive radicalization and the affordances of gaming-adjacent platforms. Second, it outlines the research methodology. Third, it presents three qualitative case analyses of far-right streamers active on DLive, Twitch, and Odysee. Finally, it discusses the broader implications of these findings for understanding extremist adaptation in youth-oriented digital environments.
From Game Content to Communication Infrastructures
Early research on the nexus between gaming and extremism largely focused on the ideological content of video games – how jihadist and far-right actors embedded symbolic violence, enemy typologies, or nationalist themes into game design. These studies emphasized the immersive potential of gameplay to reinforce in-group identity and justify violence. However, they often treated the player as a passive consumer, overlooking the participatory dynamics that define gaming cultures.[7]
More recent work has shifted from analyzing game content to examining the broader communication infrastructures surrounding games: livestreaming services (Twitch, DLive), in-game chat functions, and gaming-adjacent platforms like Discord. Kowert, Newhouse and Ruiz observe that gaming environments increasingly resemble social platforms, featuring “networked publics” and participatory tools that enable both community-building and the circulation of extremist sentiment.[8] These features include interactive elements such as group chat, follower incentives, voice channels, and symbolic signaling – what they describe as dynamics that can foster ideological alignment over time.
This perspective reframes gaming as a sociotechnical system: a dynamic infrastructure through which identity, ideology, and community are negotiated. In these settings, extremist actors use livestream chats, Q&A sessions, and real-time feedback loops to identify vulnerable users, test ideological messaging, and guide users toward more radical platforms.[9] This mode of engagement supports a collaborative radicalization process.
Jenkins’s concept of “participatory culture” helps illuminate this process: low entry barriers, informal mentorship, and peer-to-peer learning generate strong social bonds.[10] Livestream-chat dynamics on platforms like DLive and Odysee not only expose users to radical content but invite them to actively co-produce meaning in emotionally resonant, performative ways.[11]
The shift from analyzing game-centered content to examining sociotechnical ecosystems has opened up new conceptual pathways. Platforms such as Discord are no longer just entertainment venues; they are cultural hubs where users negotiate identity and encounter ideology. This transition aligns with insights from sociology and media studies that conceptualize online subcultures as participatory and affective spaces.[12]
The Dual Dynamics of Organic and Strategic Radicalization
Radicalization in gaming-adjacent spaces takes different forms. At one end are organic processes, where users encounter extremist ideas gradually through memes, inside jokes, shared reactions, and emotional bonding. At the other end are deliberate strategies, including influencer outreach, branded propaganda, and coordinated migration across platforms designed to push audiences toward more radical spaces.[13]
Papacharissi helps explain why these processes work. People use digital spaces to express emotion, identity, and frustration in informal, real-time exchanges.[14] Phillips and Milner show how humor and irony often blur intentions online, making it easier for extremist ideas to circulate as “just jokes.”[15] Jenkins adds that low barriers to participation and peer-to-peer interaction help normalize ideas gradually, even when users do not see themselves as politically involved.[16]
Strategic engagement on these platforms often resembles online marketing. Actors develop a recognizable style, select and package content carefully, and try to increase their algorithmic visibility to reach wider audiences. On platforms such as Discord, Twitch, and DLive, they use public channels and open livestreams to create the impression of everyday, spontaneous conversation, while private backchannels coordinate what messages to promote and how to frame them.[17] This mixture of planning and informal presentation creates a counter-cultural appeal built on digital aesthetics, memes, and humor – elements that resonate with young people who feel disconnected from mainstream culture.[18]
What becomes clear from these patterns is that organic and strategic radicalization are closely linked. They often unfold at the same time, and each reinforces the other. Organic interaction builds trust and emotional familiarity, while strategic actors use that trust to introduce clearer ideological messages. Understanding how these two processes work together is essential for explaining how radicalization develops within gaming-adjacent environments.
Platform-Specific Affordances and Actor Strategies
Gaming-adjacent platforms differ in design, moderation, and audience, but far-right actors exploit them in similar ways. Services such as DLive and Odysee present themselves as censorship-resistant spaces, attracting individuals and groups removed from mainstream platforms.[19]
DLive began as a blockchain-based alternative to YouTube and is known for minimal moderation and a decentralized reward system. Streamers earn cryptocurrency through viewer donations and platform incentives, allowing ideological content to be monetized.[20] Several far-right figures banned from YouTube moved to DLive to maintain their audiences and income. The platform’s reward structure, which encourages viewers to participate in streams for shared crypto benefits, creates a gamified economy around ideological engagement.[21]
Odysee, built on the LBRY blockchain protocol, offers a video-sharing system that mirrors YouTube’s interface, making the transition easy for users. Its branding as a censorship-free environment appeals to extremist creators seeking long-form or permanent ideological content. The blockchain architecture reduces the risk of takedowns, turning Odysee into a stable archive for radical narratives that can be linked from other social media platforms or embedded in chat groups.[22]
Twitch, although a mainstream Amazon-owned platform, has affordances that make it appealing as well. Its livestreaming tools, real-time chat, and strong gaming culture allow extremist messaging to be woven into gameplay, humor, or community banter. Even with stricter moderation, actors have used Twitch to normalize extremist worldviews through memes, coded language, and personality-driven streams. The parasocial connection between streamers and viewers, which grows through daily interaction, can support long-term ideological grooming.[23]
Across these platforms, interactivity is central. Livestream chats, Q&A segments, follower shout-outs, and other participatory features blend entertainment with ideological messaging. These sociotechnical affordances help shift audiences from passive viewers to active contributors, co-creating meaning alongside streamers. In this sense, the platforms do more than host extremist content; they shape the conditions in which radicalization can take root.[24]
Despite differences in governance, technical structures, and user communities, from DLive’s crypto incentives to Twitch’s mainstream visibility, far-right actors adapt their tactics across all three. Their goals remain consistent: maintaining reach, bypassing moderation, and cultivating loyal ideological communities. This convergence illustrates how diverse platforms can be repurposed toward similar radicalizing ends.[25]
These dynamics have had concrete real-world consequences. In several high-profile attacks, including the 2019 Christchurch shooting, the 2019 Halle synagogue attack, and the 2022 Buffalo shooting, gaming-adjacent platforms such as Twitch and Discord were used for planning, coordination, and in some cases broadcasting the violence itself.[26] These events show how features like real-time interaction, parasocial engagement, and livestream virality can be weaponized to turn acts of violence into a participatory spectacle.
The same affordances that support community-building, including visibility, interactivity, and reward systems, can also accelerate radicalization. As Jenkins and Papacharissi note, digital infrastructures shape what becomes visible, shareable, and emotionally engaging.[27] When hate becomes gamified, profitable, and socially reinforced, radicalization shifts from being possible to being systematically encouraged.[28]
As Christian Picciolini, a former white supremacist and former member of a neo-Nazi organization, explained in a personal interview with the authors, online gaming chats such as those in Call of Duty were often used to “test” newcomers’ reactions to slurs or hostile humor.[29] Those who responded positively were invited into more private and more radical spaces. This vetting process, enabled by anonymity and the informal tone of gaming environments, served as an effective mechanism for ideological filtering and early-stage grooming.
Research Methodology
This article draws on three case studies that illustrate how gaming-adjacent platforms are exploited by far-right extremists to radicalize and recruit followers. The subjects of analysis, referred to as Person A, Person B, and Person C, represent different modes of engagement with these platforms. Person A is a leader in a British far-right organization who uses Odysee to reach young audiences. Person B, affiliated with the Groyper movement, operates on DLive and uses gaming-themed streams to spread propaganda and solicit donations. Person C, a supporter of QAnon, does not play video games but relies on Twitch and DLive’s proximity to gaming culture to appeal to younger users.
The study did not involve systematic monitoring of specific channels over a fixed period. Instead, it employed targeted qualitative tracking. Researchers followed public conversations and streams when streamers promoted upcoming content or when noteworthy patterns emerged. This opportunistic sampling aligns with the study’s goal of examining platforms as dynamic and evolving spaces. It also means the findings are not comprehensive or statistically representative; the aim is analytical depth rather than generalizability.
Ethical considerations guided the research throughout. The study relied exclusively on publicly accessible content, involved no direct interaction with users, and did not collect or disclose any personal identifiers. This approach allowed the authors to observe radicalization and propaganda efforts as they unfolded in real time. At the same time, it excludes private or encrypted interactions, which likely play a major role in the radicalization process. Because the research did not involve interviews or direct engagement with users, it cannot fully account for individual motivations or how audiences interpret extremist messaging.
The case study format is well-suited to illustrating different uses of gaming-adjacent platforms, but it captures only a portion of the broader far-right ecosystem online. For this reason, the examples presented here should be understood as illustrative rather than exhaustive.
Case study: Person A
Person A is a British neo-Nazi activist and the founder of the far-right organization Patriotic Alternative (PA). His trajectory shows a deliberate use of gaming and livestreaming infrastructures to attract, radicalize, and mobilize younger audiences. By relying on platform features such as real-time chat, audience targeting, and gamified participation, he has built a digital space that mixes ideological messaging with entertainment, making his content accessible and appealing to younger users.
Early in his political development, Person A joined the British National Party (BNP), where he led its youth division before being expelled.[30] In 2019, he founded PA, which quickly gained traction among Gen Z users drawn to his polished online persona and emotionally engaging style. This combination of ideological clarity and digital fluency positioned him as a central figure in youth-focused far-right mobilization.[31]
Platform Migration and Audience Expansion
Before his deplatforming in April 2021, Person A had approximately 100,000 followers on YouTube and 60,000 on Twitter. After his removal, he shifted to a network of alternative platforms, where he continued disseminating antisemitic content and incorporating gaming streams into his outreach.
By early 2025, his followership on these platforms had grown to 22,000 on BitChute, 13,000 on Telegram, and 8,000 on Odysee. This upward trend suggests that his message maintained, and in some spaces even increased, its resonance despite — or because of — exclusion from mainstream platforms.[32]
Gamified Recruitment and Ideological Framing
One of Person A’s most notable tactics is his use of livestreamed gaming tournaments, particularly Call of Duty: Warzone, to draw in younger participants. These tournaments operate as mechanisms of gamified recruitment, where gaming is used as an informal entry point into ideological spaces. As reported by The Sun, these events are framed not simply as entertainment, but as opportunities to connect with disengaged youth and introduce them to PA’s worldview.[33]
Cross-Platform Radicalization Funnel
Through cross-platform coordination, Person A guides viewers from mainstream or semi-mainstream platforms toward encrypted or ideologically saturated channels. For example, during a livestream on Odysee on October 3, 2021, he directed viewers to join “PA Gaming,” a Telegram group that, despite its framing as a gaming space, featured racist memes, podcasts, and ideological material. On January 24, 2022, the group promoted a podcast in which new parents discussed “all things new white life, Patriotic Alternative, the state of Britain, and our common cause.” Other posts encouraged participation in neo-Nazi forums, including a November 19, 2021, invitation to a game session hosted by the Nordic Resistance Movement, and an October 14, 2021, announcement of a nationalist-themed game development project.
Parasocial Dynamics and Influencer Networks
Person A amplifies his reach by collaborating with other far-right influencers. One such figure — referred to here as “the Streamer” — is a Holocaust denier and white supremacist with more than 152,000 followers across YouTube, BitChute, Telegram, and X. Their joint streams strengthen parasocial bonds with viewers, fostering a sense of community built on shared humor, aesthetics, and ideology. These emotional and social ties help legitimize Person A’s messaging while embedding it within a broader ecosystem of far-right influencers.
Ideological Normalization Through Gaming Content
In addition to playing widely popular, non-political video games, Person A also promotes far-right-developed titles, including those produced by KVLT Games. KVLT Games functions as a cultural extension of the Identitarian milieu, creating “nationalist” games that combine satire, memes, and coded political references—allowing the studio to maintain deniability while still circulating far-right narratives within gaming cultures.[34] These titles are presented as “nationalist alternatives” to regular entertainment games and are used to further normalize extremist ideas within gaming subcultures. During one stream, Person A described a KVLT release as “a nationalist game planned by people who are nationalists” and urged followers to support the studio. In the accompanying chat, users discussed promoting the game to “normie kids,” signaling a deliberate attempt to push extremist content beyond existing ideological boundaries.
Taken together, Person A’s activities reflect a strategic and multifaceted use of gaming-affiliated infrastructures. Far from being peripheral, these practices demonstrate how entertainment ecosystems can function as pipelines for ideological recruitment, mobilization, and reinforcement. By embedding extremist content within the rhythms and aesthetics of digital culture, Person A illustrates a broader trend in contemporary far-right radicalization: one that leverages participatory media to foster belonging, normalize hate, and expand reach.
Case study: Person B
Background
Person B is a U.S.-based white supremacist whose trajectory reflects the adaptive strategies of far-right actors navigating organizational collapse, rebranding, and shifting digital ecosystems. Beginning in 2016, he joined Identity Evropa, a white identitarian group aimed at rebranding white nationalism for younger audiences. His activities included offline recruitment, such as approaching students at San Diego State University.[35]
Following the group’s exposure after the 2017 Unite the Right rally in Charlottesville, it rebranded as the American Identity Movement (AIM). As Miller-Idriss notes, this was part of a broader alt-right effort to aestheticize extremist ideology and reduce reputational risk.[36] AIM adopted visual updates, euphemistic language, and a strategic distancing from explicit violence.
When AIM dissolved in 2020, Person B transitioned into the “Groyper Army,” a loosely organized alt-right coalition espousing antisemitic, anti-immigrant, and conspiratorial rhetoric. He quickly emerged as a prominent figure within this sphere, leveraging his activist pedigree and digital fluency to build a cross-platform following. His trajectory illustrates the far-right’s capacity to recalibrate across brands, publics, and platforms.
Platform migration and follower metrics
Since 2021, Person B’s audience has grown substantially. On DLive, a video livestreaming platform with minimal content moderation, his following increased from approximately 6,700 in 2021 to 14,862 as of July 2025. His Telegram channel has maintained around 6,700 subscribers, while his Gab account has 1,600 followers. On SubscribeStar, he retains over 50 paying subscribers, and while Substack metrics are unavailable, he posts regularly on the platform. This sustained presence across alternative platforms demonstrates the continued resonance of his messaging in loosely regulated digital environments.
Strategic use of gamified recruitment
Person B operates primarily on gaming-adjacent platforms, especially DLive, where he broadcasts extended gameplay sessions mixed with political commentary and far-right rhetoric. His streams typically run for several hours, combining entertainment with ideological messaging in an environment shaped by real-time interaction.
DLive’s chat and voice functions enable direct engagement with viewers, fostering parasocial dynamics and affective recruitment. Person B frames his broadcasts as opportunities to “cover politics, current events, gaming, and more,” casting himself simultaneously as entertainer, commentator, and ideological guide.
In doing so, he transforms a gaming-adjacent platform into a central hub for radicalization and ideological dissemination, where ideology is woven into emotionally resonant and participatory media environments.
Embedding far-right ideology in interactive livestreams
The central component of Person B’s strategy is integrating extremist talking points into casual, often humorous livestreams. He encourages viewers to submit questions in the live chat and responds in real time via audio, fostering interactivity that normalizes radical discourse.
For instance, during a December 24, 2021, stream, he repeatedly made derogatory remarks about homosexuals and African Americans. In another broadcast on January 27, 2022, while playing Dark Souls III, he promoted COVID-19 conspiracy theories, stating:
“It’s fascinating that Europe is waking up. Countries are ditching mandates. If they try that lockdown shit here again, I’m not complying.”[37]
Such moments show how political extremism becomes embedded within entertainment content, wrapped in the affective environment of gaming and audience interaction.
Redirecting followers to more radical spaces
Beyond livestreaming, Person B actively promotes migration to adjacent platforms with even fewer content restrictions. He directs viewers to his Telegram, Gab, and Substack accounts, where his messaging becomes more explicitly radical.
On December 1, 2021, he posted a Telegram message promoting the “Great Replacement” theory, a conspiratorial narrative claiming that white populations are being systematically replaced by non-white immigrants. On October 8, 2021, he encouraged his audience to read his SubscribeStar article detailing “what the dissident right has done well and what needs to happen for us to win.”
This strategy of platform diversification deepens ideological exposure, allowing Person B to strengthen radicalization across multiple layers of digital interaction.
Affective recruitment and influencer collaboration
Person B builds loyalty by reinforcing parasocial ties. He publicly acknowledges major donors by name, responds to viewer comments during streams, and frames his community as a shared ideological project.
He also collaborates with adjacent influencers in the Groyper sphere, participating in co-streams and cross-posting their content. This mutually reinforcing network amplifies messaging, strengthens ideological frames, and normalizes participation in extremist spaces.
Ideological reinforcement through game content
The gameplay itself plays an important role. It becomes a familiar backdrop through which Person B delivers ideological messages. As he plays action or strategy games, he weaves in commentary that reflects far-right narratives. Because viewers are already relaxed and immersed in the flow of the game, the political content feels less confrontational and is easier to absorb.
This approach mirrors a broader pattern in contemporary extremist communication: using popular culture spaces to introduce radical ideas in ways that lower resistance and increase emotional engagement.
Monetization and financial infrastructure
Person B monetizes his activity through several mechanisms embedded within the same participatory platforms he uses for recruitment and ideological dissemination. On DLive, he receives donations via the platform’s native currency system (“lemons”), with some viewers contributing as much as $100 in a single stream (January 27, 2022).
He also offers tiered subscription options on SubscribeStar. A $5/month tier provides access to weekly content and the title of “supporter,” while higher tiers—up to $250/month—offer perks such as private content and signed material. As of July 2025, he maintains over 50 active subscribers on the platform.
In parallel, he publishes cryptocurrency wallet addresses on both DLive and Substack. As of December 2021, his Bitcoin and Ethereum wallets contained approximately $21,600 and $16,000, respectively. These wallets are publicly updated, providing an ongoing mechanism for anonymous contributions.
Such financial strategies reveal how ideological affinity is transformed into economic support. By embedding donation opportunities within streams and community channels, Person B blurs the line between political participation and financial patronage.
Person B exemplifies how far-right actors exploit the interactive affordances of gaming-adjacent platforms to cultivate ideological communities, deepen affective engagement, and sustain revenue. His activity highlights the convergence of entertainment, propaganda, and monetization—transforming gaming culture into a Trojan horse for radicalization. The expansion of his follower base and cross-platform reach underscores the adaptability of extremist influencers operating within a fragmented and minimally moderated digital ecosystem.
Case study: Person C
Background and Platform Choice
Person C is a prominent U.S.-based conspiracy theorist and a committed adherent of the QAnon movement. Unlike the other two cases in this study, he is not a gamer. However, he strategically uses gaming-adjacent platforms, particularly DLive and Twitch, not to stream games, but to exploit their participatory culture and interactive features to disseminate conspiracy content. His activity shows how the social infrastructure of these platforms can be repurposed for ideological ends by individuals who are not part of gaming culture but recognize its affective and cultural power.
Platform Use and Follower Metrics
Despite not producing gaming content, Person C amassed a significant audience on DLive and Twitch by late 2021. According to the authors’ monitoring, in October 2021 he had approximately 19,000 followers on DLive; by January 2022, this number had more than doubled to 42,000. As of July 2025, however, his DLive following had declined to 14,862. He attributed earlier platform issues to DLive’s inability to handle the traffic his streams generated and encouraged followers to migrate elsewhere.
By July 2025, Person C’s Rumble channel had grown to over 207,000 followers, while his Telegram channel had approximately 159,000 subscribers. This reflects a broader migration to less regulated platforms and underscores the continued growth of his online presence despite deplatforming pressures.
Content and Messaging
In January 2022, Person C hosted a livestream that featured a 50-slide presentation asserting that President Joe Biden was no longer alive and that the person in the White House was a double who had been tried in a secret military tribunal. He also used these presentations to encourage followers to “wake up,” reinforcing QAnon’s framing of reality as a layered illusion requiring insiders to decode it. He often emphasizes urgency and community loyalty, referring to viewers as “truth warriors” and labeling outsiders as “sheep.”
Cross-Platform Mobilization
Person C often directs viewers from his livestreams to his personal website, where users can access more extensive materials. The homepage includes a “New to Q?” section offering primers for newcomers, effectively onboarding new recruits. He encourages viewers to join Telegram groups and mailing lists for exclusive content, and he offers one-month free memberships to his “inner circle” through livestream promo links. This creates a tiered access model that mirrors freemium game economies, serving both ideological and financial aims by increasing commitment while generating revenue.
Aesthetic Framing and Parasocial Rhetoric
Although not a gamer, Person C closely imitates the visual and interpersonal style of gaming streams: background music, live chat, alert boxes for new donations or subscribers, and a direct-to-camera speaking style that fosters intimacy. This blending of livestream aesthetics with conspiracy content illustrates how radicalization can be embedded within platform-native formats. His conversational tone and constant engagement with viewers create a sense of horizontal, community-driven discussion, making extreme claims feel accessible and shared rather than authoritative or imposed.
Ideological Drift and Participation
Although Person C’s discourse continues to draw heavily on QAnon conspiracies, it increasingly incorporates antisemitic tropes, New World Order narratives, and apocalyptic Christian themes. His interactive streams combine prayer, pseudo-scientific claims, and nationalist messaging, creating a space that blurs the boundary between spiritual revivalism and political mobilization. Crucially, his audience is not passive: viewers comment, donate, and respond to him in real time. Through this ongoing involvement, they help sustain and normalize the ideology as it unfolds.
Discussion
The analysis of the three figures – a British neo-Nazi organizer, a U.S. Groyper-aligned streamer, and a prominent QAnon follower – shows that gaming-adjacent platforms function not simply as entertainment spaces but as sociotechnical environments that support ideological dissemination, community formation, and audience radicalization. Although the actors differ in ideology and style, their methods rely on similar platform features: livestream interfaces, real-time chat, micro-donation tools, and parasocial dynamics. Instead of presenting their messaging as formal political communication, they embed it within familiar formats drawn from gaming culture – informal conversation, humor, and continuous interaction. This framing makes ideological content feel accessible and reduces the resistance that explicit or didactic messaging often generates.
Across the cases, the interactive and routine nature of these platforms creates a sense of closeness between streamers and viewers that goes beyond typical parasocial relationships. These dynamics produce what we refer to as “ideological intimacy”: the sense of emotional closeness and shared purpose that forms when viewers engage with ideological messages through ongoing, informal interaction on these platforms. Although the phrase appears occasionally in earlier scholarship (e.g., Kazmi, 2018), it has not been defined or developed into an analytical concept. Here, the term captures how ideology becomes intertwined with everyday social engagement in ways that gradually deepen commitment.
A consistent strategy across the three cases involves platform migration. Viewers are first engaged through accessible platforms such as Twitch, YouTube (before deplatforming), or DLive, where the tone remains casual and conversational. From there, streamers direct their audiences to less regulated spaces such as Telegram, Gab, or privately hosted websites, where ideological content becomes more explicit and community norms more rigid. This creates a layered funnel: low-barrier informal interaction at the entry point, repeated exposure and reinforcement in the middle, and highly saturated ideological environments at the end.
The participatory infrastructure of these platforms also supports monetization, which strengthens engagement and sustains activity over time. Tools such as micro-donations, cryptocurrency wallets, or subscription tiers are embedded directly within the same interactive environments where ideological messaging unfolds. These mechanisms turn affective investment into financial support and enable streamers to maintain visibility, produce content consistently, and reinforce movement-aligned narratives.
Taken together, these patterns show how gaming-adjacent platforms foster forms of engagement that are emotional, communal, and iterative. In these environments, ideological messages are not only consumed but co-experienced in real time, creating shared meaning and a sense of belonging that can accelerate radicalization. Understanding these dynamics – technical, social, and emotional – is essential for analyzing how radicalization unfolds in digital spaces that do not present themselves as political arenas, yet operate as influential sites of ideological work.
Conclusion
This study shows how far-right actors use gaming-adjacent platforms as environments that support both organic and strategic forms of radicalization. Organic dynamics unfold through everyday participation, including chat interaction, shared jokes, or casual reactions during a stream, which gradually make ideological cues feel ordinary. Strategic dynamics involve more deliberate actions, such as directing viewers to less regulated platforms, coordinating content across channels, or using familiar gaming aesthetics to make extremist messages easier to absorb.
By examining three figures across DLive, Twitch, and Odysee, the analysis illustrates how spaces commonly understood as entertainment for gamers now serve as active sites of ideological activity. These actors rely on livestream conversation, gaming-style commentary, and conspiratorial narration to introduce and reinforce extremist narratives in formats that feel familiar and engaging. Because gaming and gaming-adjacent platforms draw large youth audiences, and because many young users spend substantial time in these environments as part of their daily routines, these settings create opportunities for repeated exposure to extremist ideas. The informal and continuous interaction that characterizes these platforms can lower barriers for young users to drift toward more radical content.
The findings also show that financial features such as micro-donations, cryptocurrency wallets, and subscription tiers play a supporting role in maintaining these ecosystems. While they do not drive radicalization on their own, they help sustain the visibility and activity of these actors by turning engagement into ongoing material support.
Taken together, the study highlights the need to look beyond the content of video games and focus instead on the infrastructures, social dynamics, and everyday practices that shape participation in these spaces. Understanding how these elements operate, especially for the young users who spend extensive time in gaming environments, is essential for assessing how radicalization develops in digital settings that appear apolitical on the surface yet increasingly host meaningful extremist activity.
[1] Bunmathong, L. and Lamphere-Englund, G. (2021). ‘State of Play on Gaming & Extremism – An Annotated Bibliography.’ Extremism and Gaming Research Network.
[2] Radicalisation Awareness Network (2020). ‘Extremists’ Use of Video Gaming – Strategies and Narratives.’ European Commission, RAN Communication and Narratives (RAN C&N). Available at: https://home-affairs.ec.europa.eu/whats-new/publications/ran-cn-extremists-use-video-gaming-strategies-and-narratives-online-meeting-15-17-september-2020_en (Accessed: 7 August 2025); United Nations Office of Counter-Terrorism (2022) Gaming and Violent Extremism: Research Launch on Gaming and Violent Extremism. Available at: https://www.un.org/counterterrorism/sites/www.un.org.counterterrorism/files/221005_research_launch_on_gaming_ve.pdf (Accessed: 8 August 2025).
[3] Lakomy, M. (2019). ‘Let’s Play Jihad: The Communicative Strategy of ISIS’s Videogame Propaganda.’ Perspectives on Terrorism, 13(1), pp. 17–31; Robinson, P. and Whittaker, J. (2021). ‘Playing for Hate? Extremism, Terrorism, and Videogames.’ Studies in Conflict & Terrorism, 44(11), pp. 1001–1023.
[4] Davey, J. (2024) ‘Extremism on Gaming (‑Adjacent) Platforms’, in Schlegel, L. and Kowert, R. (eds.) Gaming and Extremism: The Radicalization of Digital Playgrounds. New York: Routledge, pp. 95–109; Allchorn, W. and Orofino, E. (2025) ‘Policing Extremism on Gaming‑Adjacent Platforms’, Frontiers in Psychology. Available at: https://www.frontiersin.org/articles/10.3389/fpsyg.2025.1537460/full (Accessed: 6 August 2025); Radicalisation Awareness Network (2021) Extremists’ use of gaming (adjacent) platforms: Insights regarding primary and secondary prevention measures. European Commission; ISD Global (2021) Gaming and extremism: The extreme right on DLive. London: Institute for Strategic Dialogue. Available at: https://www.isdglobal.org/wp-content/uploads/2021/08/03-gaming-report-dlive-1.pdf (Accessed: 7 August 2025).
[5] Vergani, M., Iqbal, M., Ilbahar, E. and Barton, G. (2020). ‘The Three Ps of Radicalization: Push, Pull and Personal.’ Violence and Gender, 7(1), pp. 55–70; Koehler, D. (2014). ‘The Radical Online: Individual Radicalization Processes and the Role of the Internet.’ Journal for Deradicalization, 1, pp. 116–134.
[6] Kowert, R., Newhouse, A. and Ruiz, B. (2024). ‘Taking it to the Extreme: Prevalence and Nature of Extremist Sentiment in Games.’ Frontiers in Psychology, 15, 1410620. Available at: https://doi.org/10.3389/fpsyg.2024.1410620 (Accessed: 7 August 2025).
[7] Lakomy, M. (2019). ‘Let’s Play Jihad: The Communicative Strategy of ISIS’s Videogame Propaganda.’ Perspectives on Terrorism, 13(1), pp. 17–31; Robinson, P. and Whittaker, J. (2021). ‘Playing for Hate? Extremism, Terrorism, and Videogames.’ Studies in Conflict & Terrorism, 44(11), pp. 1001–1023.
[8] Kowert, R., Newhouse, A. and Ruiz, B. (2024). ‘Taking it to the Extreme: Prevalence and Nature of Extremist Sentiment in Games.’ Frontiers in Psychology, 15, 1410620. Available at: https://doi.org/10.3389/fpsyg.2024.1410620 (Accessed: 7 August 2025).
[9] Munn, L. (2019) ‘Alt-right pipeline: Individual journeys to extremism online’, First Monday, 24(6). Available at: https://firstmonday.org/ojs/index.php/fm/article/view/10108/7920 (Accessed: 7 August 2025).
[10] Jenkins, H. (2006). Convergence Culture: Where Old and New Media Collide. New York: NYU Press.
[11] Robinson, P. and Whittaker, J. (2021). ‘Playing for Hate? Extremism, Terrorism, and Videogames.’ Studies in Conflict & Terrorism, 44(11), pp. 1001–1023.
[12] Seering, J., Wang, W., Lin, J., and Kaufman, G. (2025) ‘Discord’s design encourages “third place” social media experiences’, arXiv preprint arXiv:2501.09951. Available at: https://arxiv.org/pdf/2501.09951 (Accessed: 7 August 2025); Beran, D. (2019). It Came from Something Awful: How a Toxic Troll Army Accidentally Memed Donald Trump into Office. New York: All Points Books.
[13] Davey, J. and Ebner, J. (2017). ‘The Fringe Insurgency: Connectivity, Convergence and Mainstreaming of the Extreme Right.’ Institute for Strategic Dialogue.
[14] Papacharissi, Z. (2010). A Private Sphere: Democracy in a Digital Age. Malden, MA: Polity Press.
[15] Phillips, W. and Milner, R. M. (2017). The Ambivalent Internet: Mischief, Oddity, and Antagonism Online. Cambridge: Polity Press.
[16] Jenkins, H. (2006). Convergence Culture: Where Old and New Media Collide. New York: NYU Press.
[17] Allchorn, W. and Orofino, E. (2025) ‘Policing Extremism on Gaming‑Adjacent Platforms’, Frontiers in Psychology. Available at: https://www.frontiersin.org/articles/10.3389/fpsyg.2025.1537460/full (Accessed: 6 August 2025).
[18] Bartlett, J. and Miller, C. (2012). ‘The Edge of Violence: Towards Telling the Difference Between Violent and Non-Violent Radicalization.’ Terrorism and Political Violence, 24(1), pp. 1–21.
[19] ISD Global (2021) Gaming and extremism: The extreme right on DLive. London: Institute for Strategic Dialogue. Available at: https://www.isdglobal.org/wp-content/uploads/2021/08/03-gaming-report-dlive-1.pdf (Accessed: 7 August 2025); Matlach, P., Hammer, D. and Schwieter, C. (2023) On Odysee: The role of blockchain technology for monetisation in the far-right online milieu. Berlin: Institute for Strategic Dialogue. Available at: https://www.isdglobal.org/wp-content/uploads/2023/05/On-Odysee_The-Role-of-Blockchain-Technology-for-Monetisation-in-the-Far-Right-Online-Milieu.pdf (Accessed: 8 August 2025); Radicalisation Awareness Network (2021) Extremists’ use of gaming (adjacent) platforms: Insights regarding primary and secondary prevention measures. European Commission.
[20] Allchorn, W. and Orofino, E. (2025) ‘Policing Extremism on Gaming‑Adjacent Platforms’, Frontiers in Psychology. Available at: https://www.frontiersin.org/articles/10.3389/fpsyg.2025.1537460/full (Accessed: 6 August 2025).
[21] ISD Global (2021) Gaming and extremism: The extreme right on DLive. London: Institute for Strategic Dialogue. Available at: https://www.isdglobal.org/wp-content/uploads/2021/08/03-gaming-report-dlive-1.pdf (Accessed: 7 August 2025); Radicalisation Awareness Network (2021) Extremists’ use of gaming (adjacent) platforms: Insights regarding primary and secondary prevention measures. European Commission.
[22] Matlach, P., Hammer, D. and Schwieter, C. (2023) On Odysee: The role of blockchain technology for monetisation in the far-right online milieu. Berlin: Institute for Strategic Dialogue. Available at: https://www.isdglobal.org/wp-content/uploads/2023/05/On-Odysee_The-Role-of-Blockchain-Technology-for-Monetisation-in-the-Far-Right-Online-Milieu.pdf (Accessed: 8 August 2025); Radicalisation Awareness Network (2021) Extremists’ use of gaming (adjacent) platforms: Insights regarding primary and secondary prevention measures. European Commission.
[23] Lakomy, M. (2019). ‘Let’s Play Jihad: The Communicative Strategy of ISIS’s Videogame Propaganda.’ Perspectives on Terrorism, 13(1), pp. 17–31; Robinson, P. and Whittaker, J. (2021). ‘Playing for Hate? Extremism, Terrorism, and Videogames.’ Studies in Conflict & Terrorism, 44(11), pp. 1001–1023.
[24] Davey, J. (2024) ‘Extremism on Gaming (‑Adjacent) Platforms’, in Schlegel, L. and Kowert, R. (eds.) Gaming and Extremism: The Radicalization of Digital Playgrounds. New York: Routledge, pp. 95–109.
[25] Ibid.
[26] Macklin, G. (2019) ‘The Christchurch Attacks: Livestream Terror in the Viral Video Age’, CTC Sentinel, 12(6), pp. 18–29. Available at: https://ctc.westpoint.edu/christchurch-attacks-livestream-terror-viral-video-age/ (Accessed: 6 August 2025); ABC News (2019) ‘Video of synagogue shooting in Germany viewed 2,000 times on Twitch before removal’, ABC News, 10 October. Available at: https://abcnews.go.com/Technology/video-synagogue-shooting-germany-viewed-2000-times-twitch/story?id=66181934 (Accessed: 6 August 2025); Office of the New York State Attorney General (2022) Investigation into the Role of Online Platforms in the May 14, 2022 Buffalo Mass Shooting. New York: Office of the Attorney General. Available at: https://ag.ny.gov/sites/default/files/buffaloshooting-onlineplatformsreport.pdf (Accessed: 6 August 2025).
[27] Jenkins, H. (2006). Convergence Culture: Where Old and New Media Collide. New York: NYU Press; Papacharissi, Z. (2010). A Private Sphere: Democracy in a Digital Age. Malden, MA: Polity Press.
[28] Phillips, W. and Milner, R. M. (2017). The Ambivalent Internet: Mischief, Oddity, and Antagonism Online. Cambridge: Polity Press; Marwick, A. and Lewis, R. (2017). Media Manipulation and Disinformation Online. New York: Data & Society Research Institute; Donovan, J. and Boyd, D. (2021). ‘Stop the Presses? Moving from Strategic Silence to Strategic Amplification in a Networked Media Ecosystem.’ American Behavioral Scientist, 65(2), pp. 333–350.
[29] Picciolini, C. (2022). Personal interview conducted by the authors, September 2022.
[30] BNP Blogspot (2010). BNP021: The BNP Falls Out with Collett. 21 March. Available at: https://thebnp.blogspot.com/2010/03/bnp021-thebnp-falls-out-with-collett.html (Accessed: 24 July 2025).
[31] Allen, C. (2024) ‘From Baking Competitions to Forced Repatriations: Patriotic Alternative and the Hybridity of the Radical Right’, Qeios, 20 February. Available at: https://www.qeios.com/read/K599X7 (Accessed: 7 August 2025).
[32] Monitoring conducted by the authors.
[33] The Sun (2021) Far-right extremists using Call of Duty: Warzone to recruit teenagers into fascist group. 9 February. Available at: https://www.thesun.co.uk/news/13986628/call-duty-warzone-far-right-patriotic-alternative-mark-collett/ (Accessed: 7 August 2025).
[34] Nisos (2024) Gamifying Extremism: The Identitarian Movement Gets Another Video Game. October 8, 2024. Available at: https://nisos.com/research/extremism-video-game-identitarian-movement/ (Accessed: 16 November 2025).
[35] Anti-Defamation League (n.d.) Identity Evropa / American Identity Movement. Available at: https://notoleranceforantisemitism.adl.org/resources/profile/identity-evropaamerican-identity-movement (Accessed: 24 July 2025).
[36] Miller-Idriss, C. (2020). Hate in the Homeland: The New Global Far Right. Princeton: Princeton University Press, 40.
[37] Monitoring conducted by the authors, 2022.
