Monday, October 14, 2019

How incels fit into a global far-right ecosystem of angry young men



On one of Toronto's worst days – as victims lay in hospital beds, families received devastating news about loved ones who survived – they cheered.

Across the internet, members of a deeply misogynistic subculture who call themselves incels, short for involuntary celibates, welcomed the news of what is now known as the van attack.

One even hailed the killer as their "new saint," according to images provided to the Star by the Southern Poverty Law Center, a U.S. legal advocacy non-profit.

"Joyous day," said another.

The 26-year-old white van driver who plowed down pedestrians on a busy North York stretch of Yonge Street that day in April 2018, identified himself as part of this "movement" without borders, and said he was "radicalized" online and inspired by his cult figures to complete his "mission," according to a transcript of an interview with police recently made public. Alek Minassian will stand trial without a jury in Toronto on 10 counts of murder and 16 counts of attempted murder next year.

Incels, experts say, are a rising threat, part of a global far-right ecosystem of angry young men who have been radicalized online and have committed a rash of recent attacks, from Christchurch, New Zealand to El Paso, Texas. Not all of these men are incels, but they are part of this larger web.

And, experts argue, not much has changed since these attacks at the tech companies and platforms that provide the forums for this hatred.

The Southern Poverty Law Center started tracking incels, under the banner of male supremacy, in 2018. There have always been misogynists, says its intelligence project director Heidi Beirich. But this is something new.


"What has happened over recent years is that members of the white supremacist milieu or the alt-right, whatever you want to call it, have increasingly come to terms with their extremism by an online path that almost always involves extreme misogyny," she said over the Phone from the center's head office in Montgomery, Alabama.

"You now have this embedded radicalization of largely young white men, but not entirely, who harbor deep hatred of women and are often coupled with a lot of extremist beliefs," she adds, "this is an increasing threat to domestic terrorism."


Neo-Nazis in the past, while not recognizing women as their equals, saw them as worthy of protection. "This new crop of extremists is definitely not doing that," notes Beirich.

The term incel was coined in the early 1990s by a Canadian woman who created an online community for lonely people struggling to find connection. But it has since been overtaken by men with a deep hatred of women who find solace online.

They believe that women who refuse to have sex with them deserve violence. In their worldview, the Stacys (attractive women) prefer the Chads (attractive men) over the incels, who sit at the bottom of the hierarchy but believe they deserve to be at the top. In the middle are the "normies," regular people.

This ideology has "accelerated into multiple attacks" in recent years, beginning with self-proclaimed incel Elliot Rodger, who killed six people and injured 14 near the University of California, Santa Barbara, in 2014, says Beirich.

Police also found that the 29-year-old gunman in the summer 2018 Danforth shooting had a potential interest in incel culture, though they did not find a clear motive or association with terrorist or hate groups. A nearly yearlong investigation into the shooting revealed Faisal Hussain had left behind a copy of Rodger's misogynistic manifesto.

There has been a surge in attacks linked to this far-right online network over the past few months. These men do not call themselves incels, but are part of the larger trend of angry young men radicalized online.

Members of this diffuse global community meet, communicate and inspire each other online, ranging from niche message boards to mainstream sites used by billions.

The Christchurch shooter – who killed 51 people – streamed part of his March attack on two mosques on Facebook Live. He also penned a white supremacist manifesto that was shared on Twitter and 8chan, an anonymous message board popular with racists and misogynists.

The following month a man in Poway, Calif., posted a racist, anti-Semitic letter on 8chan before allegedly killing one and injuring three in a synagogue shooting on the last day of Passover.

Then in August 2019, a Dallas man went to a busy Walmart in El Paso. It has been reported in U.S. media that police believe he was inspired by Christchurch, deliberately targeted Hispanic people and posted a racist, anti-immigrant manifesto on 8chan before the attack. Twenty-two people were killed and 24 injured, including a two-month-old baby whose parents died trying to shield him from the bullets. The shooting is being investigated by the FBI as a possible domestic terrorist attack and hate crime.

After El Paso, 8chan's own founder called for it to shut down, according to the New York Times. Cloudflare, an internet security company, cut off its support in August and the site, described on its Twitter profile as "The Darkest Reaches of the Internet," is now offline.

Just a week later, a 21-year-old Norwegian man allegedly killed his sister and stormed a local mosque, wounding one person. The Guardian reported that, this time, he left messages on a new message board called Endchan, saying he was inspired by Christchurch and El Paso. The attack is being investigated as an act of terrorism. Endchan has been offline in recent days. After the attack, its administrators tweeted they were recently hit by "a large influx of 8chan refugees … drastically changing the pace at which the site operates."

It can be hard to squash every smaller site that takes in "people who have been booted off Facebook and Twitter with hateful views," Beirich says. As soon as one cracks down or goes dark, the worst people on it pop up somewhere else.

But smaller sites have fewer users and are often "preaching to the choir," Beirich adds. More mainstream sites like Facebook, Twitter, and Google (which owns YouTube) can have a huge impact and reach billions.

Those bigger tech companies need to step up, as well as places where new people will be recruited and radicalized, she says.

Until the August 2017 white supremacist rally in Charlottesville, Virginia, tech companies were not recognizing white and male supremacy as a problem, Beirich adds. But "now the kind of conversation you have is, why is your enforcement so terrible?"

The worst posts cheering on the killer on the day of these attacks were from a now defunct niche website called incel.me. Minassian said in a police interview that he was "radicalized" on Reddit and 4chan. Hours before the attack he posted on 4chan using coded incel language announcing an imminent attack, hoping to inspire others. But it was on the much more mainstream Facebook that he left his last message.

Reddit took steps to curtail incels in November 2017 by taking down a forum devoted to them, and earlier that fall announced a new policy to ban content that incites, encourages, or glorifies violence.

"Communities focused on this content and users who post such content will be banned from the site," added a spokesperson for the company.

Representatives from 4chan did not respond to requests from the Star for comment. Requests for comment to an administrator email and Twitter account associated with 8chan were not returned.

A Facebook spokesperson responded that "individuals and organizations who spread hate, attack, or call for the exclusion of others" have no place on its services.


The social network's policy on dangerous individuals and organizations states that it does not allow those "engaged in organized hate." It continues to review "individuals, pages, groups and content" that breach its community standards.

YouTube Canada spokesperson Nicole Bell wrote in an email that hate speech and content promoting violence had "no place" on the platform, and that the company was "heavily invested" in both humans and technology to quickly detect, review and remove this content.

"Since the Toronto van attack in 2018, we have been taking a close look at our approach to hateful content in consultation with dozens of experts in subjects such as violent extremism, supremacism, civil rights, and free speech, and as a result of that we announced major changes in June to tackle these issues," she added.

A spokesperson for Twitter referred the Star to their global policy strategist's U.S. congressional testimony from June 2019, in which he explained that the company suspended more than 1.5 million accounts for terrorism-related violations between August 2015 and 2018, and has seen a steady decline in terrorist organizations trying to use its service over the years.

Stephanie Carvin, an assistant professor of international relations at Carleton University, says finding, reviewing and removing this content is difficult. But, she notes, it has been done before.

"It's always going to be the dark corners of the internet, but we have to be successful about taking down Islamic State propaganda," she says.

"The far right is far more diffuse and far less cohesive. But still, it should be possible to identify the nodes of these networks."


Carvin says anonymous online communities provide a forum for these men that pushes them toward violence.

"You're daring each other to do more and more extreme things," she says.

"These individuals are carrying out attacks. They're killing lots of people, forming transnational links, inspiring each other."

She says social media companies need to do a better job of enforcing their own terms and conditions. "It's hard but you run a business, is this how you want your business to be used?" She asks.

The companies depend on user-reporting, artificial intelligence and human judgment calls by moderators to enforce their policies. But there have been numerous reports that this is not done consistently.

A 2017 investigation by ProPublica found Facebook's enforcement of its hate speech policies was "uneven," and after the news organization asked the social media giant about its handling of 49 offensive posts, the company acknowledged its content reviewers had made the wrong call on nearly half of them. Another investigation by the U.S. non-profit earlier that year found Facebook's policies tend to favor governments and elites over individuals with less power. Reuters found more than 1,000 examples of posts, comments and pornographic images attacking the Rohingya and other Muslims that were still on Facebook in 2018, despite Mark Zuckerberg's assurances that the company was cracking down.

CNBC reported in August that Twitter users were switching their country location to Germany, where local laws require companies to pull down Nazi content quickly, in order to escape the anti-Semitism and racism they were still experiencing on the site.

In his interview with police, Minassian said he did not deliberately target women. He said he just saw a crowded area and decided to “go for it.” But he referred to two incel mass killers in the transcript: Elliot Rodger and Chris Harper-Mercer, who killed 10 people at an Oregon community college in 2015. He said he communicated with both online, and inspired a man in Edmonton to commit an attack.

These claims have not been independently verified by the Star.

In the last year or so since these attacks, law enforcement agencies have begun to recognize incels as a new public safety threat, Carvin says.

CSIS referred to the van attack, as well as the 2016 Quebec mosque shooting, in its 2018 annual public report, published in June, under the heading of "Right Wing Extremism."

The move, Carvin says, signals a new priority.

While Christchurch catalyzed this shift, the van attack "may have been the beginning of the momentum," she says.

The incel ideology is not as clear-cut as that of other terrorist movements, and is more a collection of random grievances aimed at women in general, she adds. But, she notes, under the Criminal Code, terrorist acts can be committed "in whole or in part for a political, religious or ideological purpose, objective or cause … with the intention of intimidating the public."

"These guys are drawing their ideas in part" online, she says.

That threat is not being taken more seriously by companies and society as a whole because of the normalization of rape culture and violence against women, says Nicolette Little, a critical media studies researcher at the University of Calgary.

"If you look at the uproar that might happen around what people think of as a more standard terrorist attack, and compare it to the kind of uproar, or lack of it, around this kind of thing, I think that's a really interesting point to consider," she says.

"It looks like these events are happening, like this van attack, and there is a lot of willingness to do something but then it just fades so quickly."

She worries about copycat attacks, and notes that the van attacker evoked Rodger in his Facebook post.

"Digital media is a beast that really has trouble controlling and understanding how to control," she says.

"Not a one-on-one fight, not a single forum, a much wider network of women-haters online."

At the same time, she cautions against giving these men too much oxygen, and even questions the continued use of the name they have given themselves. Some Western leaders and media have started using the term Daesh instead of Islamic State or ISIS because it is a derogatory label that delegitimizes the terrorist group, she notes.

"I think we might want to step away from the term incel," Little says.

"And start calling them what they are, which are really angry loathsome misogynists who are doing terrible things out of a strange mix of self-loathing and hatred of women."

May Warren
