Stranger Danger and online fringe communities

Published: 5 August 2024

New EPFL research has found that the exchange of comments between members and non-members of fringe communities (fringe-interactions) on mainstream online platforms attracts new members to these groups. It has also suggested potential ways to curtail this growth.

Fringe communities promoting conspiracy theories and extremist ideologies have thrived on mainstream online platforms, consistently raising the question of how this growth is fueled. Now, researchers in EPFL’s School of Computer and Communication Sciences (IC) studying this phenomenon have found what they believe to be a possible mechanism supporting this growth.

In their paper, “Stranger Danger! Cross-Community Interactions with Fringe Users Increase the Growth of Fringe Communities on Reddit”, which recently won the Best Paper Award at the 18th International AAAI Conference on Web and Social Media (ICWSM), scientists from the Data Science Laboratory (DLAB) define fringe-interactions – the exchange of comments between members and non-members of fringe communities. Applying text-based causal inference techniques to study the impact of fringe-interactions on the growth of fringe communities, the researchers focused on three prominent groups on Reddit: r/Incels, r/GenderCritical and r/The_Donald.

“It’s well known that there are fringe communities on the big online social platforms, and equally well known that these platforms try to moderate them. However, even with moderation these communities keep growing. So, we wondered what else could be driving this growth?” explained Giuseppe Russo, a postdoctoral researcher in DLAB.

“We found that fringe-interactions – basically entailing an exchange of comments between a ‘vulnerable user’ and a user already active in a fringe community – were bringing people on the edge towards these groups. And the comments aren’t obvious, like ‘join my community’ – they are topical discussions that attract the attention of other people, and a clever way to skip moderation policies,” he continued.

Users who received these interactions were up to 4.2 percentage points more likely to join fringe communities than similar, matched users who did not. The size of this effect depended on the characteristics of the community where the interaction took place (e.g., left- vs. right-leaning communities) and on the language used: interactions employing toxic language were 5 percentage points more likely to attract newcomers to fringe communities than non-toxic ones.
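The matched-comparison logic behind effect sizes like these can be sketched roughly as follows. This is a minimal illustrative example, not the authors’ code: the user records, join outcomes and numbers are all hypothetical assumptions, and the actual study used text-based causal inference over real Reddit data.

```python
# Illustrative sketch: estimate the effect of receiving a fringe-interaction
# as the difference in join rates between treated users (who received one)
# and similar, matched control users (who did not), in percentage points.

def join_rate(users):
    """Fraction of users who later joined a fringe community."""
    return sum(u["joined"] for u in users) / len(users)

def matched_effect_pp(treated, control):
    """Difference in join rates, treated minus control, in percentage points."""
    return 100 * (join_rate(treated) - join_rate(control))

# Toy matched samples: joined=1 means the user later joined a fringe community.
treated = [{"joined": 1}, {"joined": 0}, {"joined": 1}, {"joined": 0}]
control = [{"joined": 0}, {"joined": 0}, {"joined": 1}, {"joined": 0}]

print(matched_effect_pp(treated, control))  # prints 25.0
```

In the study itself, “similar” users are found by matching on observable characteristics (including the text they write), so that the comparison isolates the effect of the interaction rather than pre-existing differences between the two groups.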

Once a vulnerable user had engaged in an exchange of comments, finding a fringe community online was just a click away, as every user profile immediately shows the groups in which someone has been active. The researchers also found that this recruitment method is unique to fringe communities – they didn’t find it occurring in climate, gaming, sporting or other communities – supporting the hypothesis that this could be an intended strategy.

“The recruitment mechanism is very simple, it’s literally just one exchange of comments. One small discussion is enough to drive vulnerable users towards these bad communities and once they are there, many previous studies have shown that they radicalize and become active members of these extremist communities,” said Russo.

The research results raise important questions – is this recruitment mechanism relevant enough to warrant the attention of online platforms’ moderation policies and, if yes, how can we stop it from helping fuel the growth of these communities?

Based on the observational period considered, the researchers estimate that approximately 7.2%, 3.1% and 2.3% of the users who joined r/Incels, r/GenderCritical and r/The_Donald, respectively, did so after interacting with members of those groups. They suggest that community-level moderation policies could therefore be combined with sanctions applied to individual users – for example, reducing the visibility of their posts or limiting the number of comments they can make in more susceptible communities. The researchers believe that these measures may diminish the impact of fringe-interactions and slow the growth of fringe communities on mainstream platforms.

“From a societal perspective, I think this is extremely relevant for many reasons, one of which is that we have seen that the impact of these fringe communities is not exclusively online, there is an offline impact too. We have seen riots and terrorism, we have seen women targeted and killed by members of these communities, so by reducing access to these communities we definitely reduce the risk offline which is what we really care about in the end,” concluded Russo.

Author: Tanya Petersen
