The Future

The Alt-right is recruiting depressed people

Alt-right figures are targeting vulnerable communities with videos and, unfortunately, it seems to be working.

A video on YouTube entitled “Advice For People With Depression” has over half a million views. The title is generic enough, and to the unsuspecting viewer, lecturer Jordan Peterson could even look legitimate or knowledgeable — a quick Google search will reveal that he even spoke at Harvard once. But as the video wears on, Peterson argues that men are depressed and frustrated because they don’t have a higher calling like women (who, according to Peterson, are biologically required to have and take care of infants). This leaves weak men seeking “impulsive, low-class pleasure,” he argues. At first glance he certainly doesn’t seem like a darling of the alt-right, but he is.

Type “depression” or “depressed” into YouTube and it won’t be long until you stumble upon a suit-clad white supremacist giving a lecture on self-empowerment. They’re everywhere. For years, members of the alt-right have taken advantage of the internet’s most vulnerable, turning their fear and self-loathing into vitriolic extremism, and thanks to the movement’s recent galvanization, they’re only growing stronger.

“I still wonder, how could I have been so stupid?” writes Reddit user u/pdesperaux, in a post detailing how he was accidentally seduced by the alt-right. “I was part of a cult. I know cults and I know brainwashing, I have researched them extensively, you'd think I would have noticed, right? Wrong. These are the same tactics that Scientology and ISIS use and I fell for them like a chump.”

“NOBODY is talking about how the online depression community has been infiltrated by alt-right recruiters deliberately preying on the vulnerable,” writes Twitter user @MrHappyDieHappy in a thread on the issue. “There NEED to be public warnings about this. 'Online pals' have attempted to groom me multiple times when at my absolute lowest.”

“You know your life is useless and meaningless,” Peterson says in his “Advice” video, turning towards the viewer, “you're full of self-contempt and nihilism.” He doesn’t follow all of this rousing self-hatred with an answer, but rather merely teases at one. “[You] have had enough of that,” he says to a classroom full of men. “Rights, rights, rights, rights…”

Peterson’s alt-light messaging quickly takes a darker turn. Finish that video and YouTube will queue up “Jordan Peterson - Don't Be The Nice Guy” (1.3 million views), and “Jordan Peterson - The Tragic Story of the Man-Child” (over 853,000 views), both of which are practically right out of the redpill/incel handbook.

“The common railroad stages of 'helpful' linking to 'motivational speakers' goes 'Jordan Peterson ---> Stefan Molyneux ---> Millennial Woes,'” writes @MrHappyDieHappy. “The first is charismatic and not as harmful, but his persuasiveness leaves people open for the next two, who are frankly evil and dumb.” Molyneux, an anarcho-capitalist who promotes scientific racism and eugenics, has grown wildly popular amongst the alt-right as of late. His videos — which argue, among other things, that rape is a “moral right” — are often used to help transition vulnerable young men into the vitriolic and racist core of the alt-right.

Though it may seem like a huge ideological leap, it makes sense, in a way. For some disillusioned and hopelessly confused young men, the alt-right offers two things they feel a serious lack of in the throes of depression: acceptance and community. These primer videos and their associated “support” groups do a shockingly good job of acknowledging the validity of the depressed man’s existence — something men don’t often feel they experience — and capitalize on that goodwill by galvanizing their members into a plan of action (which generally involves fighting against some group or class of people designated as “the enemy”). These sorts of movements afford the depressed person a form of agency which they may never have experienced before. And whether it’s grounded in reality or not, that’s an addictive feeling.

According to Christian Picciolini, a former neo-Nazi who co-founded the peace advocacy organization Life After Hate, these sorts of recruiting tactics aren’t just common, but systematically enforced. “[The recruiters] are actively looking for these kind of broken individuals who they can promise acceptance, who they can promise identity to,” Picciolini said in an interview with Sam Seder. “Because in real life, perhaps these people are socially awkward — they're not fitting in; they may be bullied — and they're desperately looking for something. And the ideology and the dogma are not what drive people to this extremism, it's in fact, I think, a broken search for that acceptance and that purpose and community.”

Taking vulnerable young people and placing them in a group of seemingly supportive and energetic ‘friends’ is also how ISIS and other cults recruit. The dogma becomes second nature.

Some of the most toxic unofficial alt-right communities online have operated on this principle. r/Incels (now banned, thankfully) began as a place for the “involuntarily celibate” to commiserate, but quickly became the place for extreme misogynists to gather and blame their problems on women and minorities. “Men Going Their Own Way” (MGTOW) was initially a space for men to commune and protect their sovereignty as dudes “above all else,” but it devolved into an infinitely racist and misogynistic hellhole. Similar fates have befallen r/Redpill, r/MensRights, and countless others. Commiseration begets community begets a vulnerable trend towards groupthink.

While it’s easy to isolate purely hateful content, the type that preys upon the disenfranchised and uses far more insidious methods to bring them into the fold is much more difficult to manage on expansive platforms like YouTube — particularly because the message being sent isn’t obvious, in-your-face hate speech or anything so immediately objectionable, but rather a slow burn. It’s not the sort of thing you can train algorithms to spot — or at least, not yet — making the issue of containment that much harder to address.