
In this article:
- “Alt-right pipeline” is a term used to describe the process by which people are indoctrinated into extremist right-wing ideologies online.
- The most disturbing part is that social media algorithms, more than actual alt-right individuals, are to blame for the gradual pull of a person toward increasingly radical content and ideas.
- Algorithms built to deliver more of what you might like base that decision, in part, on the interests of others who watch the same content as you. If alt-right radicals share your interest in self-help videos, you might see increasingly radical content in your suggestions.
- If you or someone you know is being pulled down this alt-right pipeline, it is possible to come back out. But it takes patience, kindness, and a new, healthier source of community and belonging.
“Alt-right pipeline” is a phrase that’s been thrown around online by YouTube vloggers, Twitter users, and Redditors for the past couple of years or so, but the flippancy of its use often obscures its exact meaning. Look, it’s the internet.
Everything becomes a joke at some point. But with the increasing polarization of political and ideological views, both online and offline, understanding what the alt-right pipeline is and why people get pulled in by it can go a long way towards bridging differences.
What Is the Alt-Right and What’s a Pipeline?

Let’s start with the basics and break the phrase “alt-right pipeline” into its component parts.
When we say alt-right, we’re specifically referring to a resurgent right-wing movement characterized by its use of social media to disseminate radical ideologies. Your average alt-right person is more likely to have a secondary Twitter account for interacting with alt-right content than a bumper sticker that outright tells you where their ideological beliefs lean. Think QAnon and Twitter accounts with anime girl profile pictures (that is not a joke).
The alt-right is a loosely organized movement, if it’s organized at all, so the specific views of its participants tend to vary.
Typically, though, many of them share anxieties about a perceived decline of society, which they blame on the influence of immigrants, feminism, gay people, trans people, and so on. The movement’s main concern is the preservation of a “white identity” against “replacement” and the upholding of what it considers to be the ideals of Western civilization.
That brings us to the next part of the alt-right pipeline, the pipeline.
“Pipeline” is a term used to describe the gradual introduction of people who are uninitiated in a particular subculture or set of beliefs — “normies,” if you want to be really cringe about it — to the core of what that subculture is into or what the group truly stands for.
It also implies a subtle use of manipulation: get someone to like or agree with the mildest variant of a belief before introducing the next, slightly more radical one, and they’re far more likely to go along with it than they would be if you presented the core ideology from the start.

What makes the alt-right pipeline unique is that it doesn’t require the participation of a single person or group of people to be effective. All it needs is a recommendation algorithm that serves up more and more polarizing content with each new piece of content a person views.
This brings us to the O.G. of the alt-right pipeline: YouTube.
How the YouTube Algorithm and Social Media Platforms Radicalize Youth

Generally speaking, social media algorithms work by showing you content that you might like based on the content you liked before.
For example, let’s say you like a video about building computers or, at least, you watch multiple videos about how to build computers within the span of a few days. This tells the algorithm that you’re probably building a PC, so it starts showing you advertisements for computer parts.
Depending on what else it knows about you, it can likely deduce your gender, age range, and interests. Maybe even something intimate, like whether or not you’re pregnant.
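To make that a little more concrete, here’s a deliberately tiny sketch in Python of the kind of logic at work, using the PC-building example: recommend whatever shares the most topic tags with what you’ve recently watched. The titles and tags are invented for illustration, and real platforms rely on far more elaborate machine-learned models, but the basic feedback loop is the same.

```python
# A toy recommender: suggest videos whose topic tags overlap most with what
# the user has recently watched. Titles and tags are made up for illustration;
# real systems learn these associations from behavior at a massive scale.

CATALOG = {
    "How to Build Your First Gaming PC": {"pc-building", "hardware", "gaming"},
    "Best GPUs Under $300": {"hardware", "gaming", "budget"},
    "Mechanical Keyboards Explained": {"hardware", "peripherals"},
    "Beginner Guitar Lessons": {"music", "hobbies"},
    "10-Minute Pasta Recipes": {"cooking", "food"},
}

def recommend(watch_history, top_n=2):
    """Rank unwatched videos by how many tags they share with the watch history."""
    seen_tags = set()
    for title in watch_history:
        seen_tags |= CATALOG.get(title, set())

    candidates = [title for title in CATALOG if title not in watch_history]
    # More shared tags = a stronger "you might like this" signal.
    candidates.sort(key=lambda title: len(CATALOG[title] & seen_tags), reverse=True)
    return candidates[:top_n]

print(recommend(["How to Build Your First Gaming PC"]))
# ['Best GPUs Under $300', 'Mechanical Keyboards Explained'] -- more PC content,
# plus a pretty good guess that you're into hardware in general.
```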
So let’s imagine you’re someone young, unhappy, and in need of guidance. It’s hard learning to navigate the world on your own, so you hop on YouTube in search of someone who has a better idea of how the world works than you do. Someone smart and sure of themselves. Someone who can tell you how to figure out your shit.
In the search results, you see a video thumbnail showing a group of confident-looking people sitting in front of microphones. You just know they are the someones you’re looking for, so you click on it, watch the video, end up liking the content, and watch more of it.
Every now and then, the hosts bring on a guest speaker who also has something super motivational to say, but with a new idea attached: it’s your fault your life sucks and you should take charge of it…but there’s someone out there making it hard for you.
That idea plants a little seed of doubt and paranoia, one that gets watered by the frustration you already have with the “state of things,” whatever that means.
Here’s the thing, though: you aren’t the only one watching that content. The people who are further down the pipeline than you, who have watched similar content, have taught the algorithm that people who like these videos also like more…extreme types of content.
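That “people who like these videos also like…” step is, at its core, just co-occurrence counting. Here’s another simplified Python sketch, again with invented viewing histories, of how a relatively small cluster of heavily radicalized viewers can end up steering the suggestions for everyone who shares even one video with them:

```python
from collections import Counter

# Invented watch histories. A handful of viewers who binge the extreme stuff is
# enough to link the "gateway" video to everything else they happen to watch.
HISTORIES = [
    ["How to Talk to the Opposite Sex", "Confidence Tips for Introverts"],
    ["How to Talk to the Opposite Sex", "The Market Value of Male and Female Sexuality"],
    ["How to Talk to the Opposite Sex", "The Market Value of Male and Female Sexuality",
     "Why 'They' Are Ruining Everything"],
    ["How to Talk to the Opposite Sex", "The Market Value of Male and Female Sexuality",
     "Why 'They' Are Ruining Everything"],
]

def also_watched(video, top_n=2):
    """'Viewers who watched this also watched...' via simple co-occurrence counts."""
    counts = Counter()
    for history in HISTORIES:
        if video in history:
            counts.update(title for title in history if title != video)
    return [title for title, _ in counts.most_common(top_n)]

print(also_watched("How to Talk to the Opposite Sex"))
# ['The Market Value of Male and Female Sexuality', "Why 'They' Are Ruining Everything"]
```

The innocuous video ends up recommending the radical ones purely because of who else happened to watch it.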
A simple search for “How to Talk to the Opposite Sex” turns into suggestions for “What You Really Need to Know About Sexuality” which then becomes, if you’re watching the same content that a lot of radicalized young people have been watching, “The Market Value of Male and Female Sexuality.”
And before you know it, you’re judging people’s value exclusively on their “market value” (typically attractiveness for women, financial standing for men). From there, that cluster of beliefs leads you to even more radical content, the kind that promotes everything from reactionary nationalism to anti-trans violence.

This isn’t just a hypothetical scenario. These events have played out in real life for many unsuspecting young men who found themselves pulled into the world of the alt-right through videos that initially promised to help them grow as people.
Caleb Cain was only 26 years old when he found himself being drawn into radical alt-right beliefs through self-help-type content.
“I just kept falling deeper and deeper into this, and it appealed to me because it made me feel a sense of belonging,” Cain told the New York Times. “I was brainwashed.”
He isn’t the only one to go through something like this either.
Other isolated, lost young people online have shared similar stories about how they were radicalized on the internet by content that preyed on their confusion, dissatisfaction, and general feelings of unhappiness.
Communities like r/IncelExit have cropped up online with the mission to give people what they need — guidance and a sense of belonging to a community — without introducing them to hateful radical beliefs.
But YouTube, and the internet as a whole, isn’t all doom, gloom, and anti-anyone-not-like-me. In recent years, content creators have stepped up to provide people with the mental tools they need to at least be aware of when they’re being radicalized.
HealthyGamerGG is a YouTube channel run by Dr. Alok Kanojia, a psychiatrist who specializes in explaining mental health issues in gamer terms. Incidentally, he shares an audience demographic with a lot of alt-right content creators: young, introverted men who are super into video games and other stereotypically “geeky” hobbies.
The only difference? Dr. K helps his audience unpack the temptation to take the black pill. The [insert color] pill is a term that references The Matrix and is used by alt-right believers to describe the decision to see “the truth.”
As you can guess, that truth is usually something hateful that provides a target for their personal worries and anxieties about the current sociopolitical climate.
What Can You Do if You Suspect You or a Loved One Is Being Radicalized?

Once someone starts buying into the alt-right pipeline, it can be hard to get them to stop believing in it. Any attempt to tell them that what they believe is harming them, and the people around them, only makes you look unaware of “the truth.”
Here’s what not to do: Be aggressive.
Aggression can take the form of judgment, something you can easily make people feel by outright telling them they’re in the wrong. It gets people to pull their walls up and disconnect emotionally from you, making them less likely to listen.
As cheesy as it sounds, compassionate and non-judgmental listening is the best way to get through to people who are nursing hurts that have led them down the alt-right pipeline.
Another technique is to do what Dr. K does and provide another source of community and belonging that isn’t centered on hateful ideology. Try to introduce them to a friend group with similar interests, but one that you’re sure has generally well-adjusted members who aren’t likely to push them further down the alt-right pipeline.
If you suspect that you’re the one starting to go down the alt-right pipeline, you’ll have to stay aware of how it intentionally feeds on your unhappiness. Keep away from content that amplifies those feelings of discontent by “validating” them.
Find a new community that can give you that sense of understanding you get from radical groups without the radical part and, if you really have to, tell someone so they can keep an eye out for you.
Buddy system, internet folks, it works for more than just school field trips.
Of course, these things are easier said than done. If you really don’t have the tools to help yourself or someone you care about, consider reaching out to a mental health professional (or encouraging your loved one to do so) who can help unpack the root cause of that unhappiness and frustration.