The term “internet troll” dates back to the ’80s and ’90s, when it described individuals who derailed conversations on Usenet with spam and harmful content. Far from the small, colorful, and incessantly happy animated creatures we’ve seen on the big screen, real-life trolls work to provoke and manipulate others, either for their own amusement or for a specific political goal.
And as the internet grew over the past few decades, so did they.
Trolling today is no longer just the work of an angry acquaintance on Facebook or a Twitter user with an egg for a profile picture. It has evolved into an elaborate, industrialized, and global enterprise, often coordinated in what we now know as troll farms.
These are sophisticated operations staffed by people whose job is to sow disinformation online in an effort to influence people and governments. Some work in physical offices, while others are paid by the day as freelancers. Together, workers in these farms amplify key messages, using fear and anger to encourage real users to engage with their content and believe it.
If you spend any amount of time on the internet, you’ve probably encountered one. In the lead-up to the 2020 US elections, for instance, an internal Facebook report found that troll farms were reaching some 140 million Americans, more than 40 percent of the population, every single month.
In recent years, journalists, governments, and platforms have taken on the tall order of addressing the problem of troll farms. But uncovering the truth behind operations specifically designed to manufacture and distribute lies is no easy feat, and things can get complicated quickly. While working on a 2015 piece about Russian troll farms, for example, investigative journalist Adrian Chen was himself targeted by a sophisticated disinformation campaign.
But thanks to investigative reports and testimonies, bits and pieces about what life is like inside troll farms and the world of networked disinformation are coming to light.
A Global Industry of Manufactured Noise
Aside from the Russian troll farms that influenced the 2016 US elections, recent reports have pointed to similar operations in developing countries where internet use is growing much faster than jobs and wages: the Philippines, Nigeria, Venezuela, North Macedonia, and Ghana, among others. But troll farms have also made their way to America, and their operations have become globalized.
For instance, earlier this year, a report by Digital Africa Research Lab and BuzzFeed News revealed how a troll farm in Nigeria, operated by a local PR firm, was paid by a UK-based non-profit to tweet support for a businessman from Colombia who was under investigation for money laundering in the US. Twitter subsequently suspended over 1,500 accounts that were found to be part of the operation.
“Behind the madness is an invisible machine,” wrote Dr. Jonathan Corpus Ong and Dr. Jason Vincent Cabañes, whose work on fake news production in the Philippines investigated why trolling has become so professionalized and institutionalized. It is “industrial in its scope and organization, strategic in its outlook and expertise, and exploitative in its morality and ethics.”
Centralized, Hierarchical Operations
In several in-depth reports, troll farm operations have been found to be centralized, with clear management hierarchies and teams that operate entirely separately from one another.
Katarzyna Pruszkiewicz, an undercover reporter who spent six months working at a Polish troll farm, describes the operation as secretive, controlled, and specific. After signing her contract, Pruszkiewicz was tasked with starting a Twitter account and maintaining the false identity tied to it. She tweeted about politics and everyday life to keep things “credible and convincing,” while two managers told workers what kinds of posts to react to, which people to target, and how.
Trolls in other farms across the globe start similarly. A 2018 New York Times interview with a former Russian troll revealed that the first step there was to create three identities on LiveJournal, a blogging platform that remains popular in Russia. At the start of every workday, managers assigned the issues to cover, the people to promote, and the people to defame.
The pay for this type of work varies widely, from $1 per post to $777 a month to $1,400 a week. Others are paid in cell phones or other gifts.
After being promoted to manager, Pruszkiewicz also collected each day’s output (comments, posts, and replies) in an Excel file. Reports were sent to superiors daily, weekly, and monthly.
Often, at the top of the hierarchies are executives from advertising and public relations firms. Chen’s report points to how government-backed disinformation relies on the same firms that work on corporate social media marketing.
Similarly, Ong and Cabañes’ research found that leaders at local ad agencies, whom they call “chief architects,” are able to use tried-and-tested industry techniques — usually for promoting household brands and celebrities — for the purposes of political disinformation. And indeed, the troll farm Pruszkiewicz worked in describes itself as an “ePR agency.”
The Art of Not Getting Caught
Over the years, troll farms have also developed ways to bypass platform rules and learn from the mistakes of failed operations. One cautionary example is China’s infamous “fifty-cent army”: state-sponsored trolls who worked under such rigid arrangements, and emphasized quantity over quality so heavily, that they became easy to identify. Ironically, all their efforts made people trust the state even less.
Today’s trolls are much, much better at deception. Ross Tapsell, whose research took him to troll farms in the Philippine province of Cebu, told the Los Angeles Times that when done right, troll accounts “can look like legitimate Facebook users to trick the [company’s] artificial intelligence.”
For instance, trolls steal photos of real people for their profile pictures, mix inspirational quotes and the everyday thoughts of an imaginary person in with their disinformation, and make sure to add and interact with people who aren’t trolls. They’re also instructed to push each key message in their own words. This way, they can better avoid detection by platform algorithms, which often check for copy-pasted comments.
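To see why rewording matters, here is a toy sketch of the kind of copy-paste check described above. This is a simplified illustration in Python, not any platform’s actual system; the function names and the normalization step are assumptions for the sake of the example.

```python
import hashlib
from collections import defaultdict

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so trivial edits don't hide a copy-paste."""
    return " ".join(text.lower().split())

def find_copy_pasted(comments: list[str]) -> list[list[str]]:
    """Group comments whose normalized text is byte-for-byte identical."""
    groups = defaultdict(list)
    for comment in comments:
        digest = hashlib.sha256(normalize(comment).encode("utf-8")).hexdigest()
        groups[digest].append(comment)
    return [group for group in groups.values() if len(group) > 1]

comments = [
    "Candidate X is the ONLY one fighting for people like us!",
    "candidate x is the only one fighting   for people like us!",  # caught: identical once normalized
    "Nobody fights for ordinary people the way Candidate X does.",  # missed: same message, fresh words
]
print(find_copy_pasted(comments))
```

The first two comments collapse to the same string once case and spacing are normalized, so an exact-match check flags them. The third carries the same key message in new words and sails through, which is precisely what the “use your own words” instruction exploits.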
Troll farms also continuously monitor these defenses. Cody Buntain, an assistant professor of informatics at the New Jersey Institute of Technology, points out that when social media platforms ban troll accounts, farms study what exactly alerted the platforms to their presence, so they know to avoid similar behavior in the future.
Grueling, Exploitative Work
Trolling is a numbers game. In Pruszkiewicz’s case, she found that a small team of a dozen or so people could command at least 70 accounts on Facebook, 94 on Twitter, 11 on Instagram, and three on YouTube. Collectively, these can create 10,000 posts defending a client or attacking their rivals, potentially reaching 15 million views.
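A quick back-of-the-envelope calculation, using only the figures reported above, shows just how much leverage that buys. This is a rough sketch; the real ratios surely vary by operation.

```python
# Back-of-the-envelope math using only the figures reported above.
workers = 12                   # "a dozen or so people"
accounts = 70 + 94 + 11 + 3    # Facebook + Twitter + Instagram + YouTube
posts = 10_000
views = 15_000_000

print(f"{accounts} accounts, about {accounts / workers:.0f} per worker")
print(f"about {posts / accounts:.0f} posts per account")
print(f"about {views / posts:,.0f} views per post")
```

Roughly fifteen fake accounts per worker, with each post multiplied into some 1,500 views: that is how a dozen people can manufacture what looks like a grassroots groundswell.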
All this is made possible by cheap labor. Many of the trolls interviewed in these reports worked 12-hour shifts in rooms with dozens of people, with high quotas to meet every day. They are tasked with writing elaborate blog posts, posting social media updates, or commenting in community groups, on news sites, and on the social media pages of news outlets and rival politicians. Often, it is a combination of all of these.
For many of these workers, who got into disinformation work on the promise of higher-than-usual salaries, the exact job description was unclear, especially in the beginning. Some, as Ong and Cabañes’ research revealed, were coerced into troll work because they were already employees of a government agency or a politician. In cases like these, the work can even go unpaid.
Whether they believe the messages they publish en masse varies from worker to worker. Some are true believers who end up becoming cheerleaders for their clients, and this can show in the quality and quantity of their work. Others resist, and consider promoting a politician or regime to be just that: a job.
This is why, beyond the sheer quantity of content they produce, these workers also carry the emotional labor of morally justifying their work. Those on the lowest rungs of the hierarchy often point to economic hardship: many have had to work more precarious jobs for less pay in the past.
Others further up the ladder reason that they have bent the truth and executed the same strategies for “legitimate” clients anyway, such as brands or celebrities. Because these practices are already industry standard, it becomes easier to explain away political disinformation work as more of the same.
Last but not least, Ong and Cabañes explain that chief architects, those at the top of the hierarchy, tend to brush off the moral complications of disinformation work by treating it as a game. They may express excitement over rewriting the rules of social media, or compare themselves to controversial fictional characters like Scandal’s Olivia Pope.
Preying on the Poor and Disenfranchised
Because of the budgets provided by clients and the nature of the work, troll farms often take advantage of the economic precarity of their workers. For instance, Ong and Cabañes’ report found that those on the lowest levels of the disinformation hierarchy are often driven by financial need. Many have worked less stable and less rewarding jobs in the past, and are lured in by material and symbolic rewards, like the opportunity to work in five-star hotel suites or to get the latest iPhone.
Meanwhile, Pruszkiewicz found that many of her colleagues at the troll farm — grouped together in a Slack channel called “Kulawa Rebelia,” or “Rebellion on Crutches” — were disabled, and would not have been able to earn a living for themselves otherwise. “Many of them are really good people,” Pruszkiewicz tells the Guardian. “They are compassionate, they do charity work and engage in social activism in their spare time, but their disabilities mean that their employment opportunities are limited.”
These workers often bear the brunt of public anger over disinformation, while chief architects at the helm of troll farms across the globe are able to maintain respectability and avoid accountability.
No Easy Answers
In the decades since the first trolls of Usenet, trolling has become a professionalized and normalized industry across the globe — with its own logic, hierarchies, and best practices.
Indeed, the research described here only scratches the surface of the emerging global economy of networked disinformation, and solving the problem of industrialized troll farms will likely be an ongoing project for platforms and governments for years to come.
Key in this fight should be addressing the economic hardship that pushes people toward disinformation work, and holding accountable not only the chief architects who organize troll farms, but also the individuals, organizations, and regimes that make up the market for their services.
Facebook’s public policy director for global elections, Katie Harbath, once described the Philippines as “patient zero in the global disinformation pandemic.” So if you’d like to learn more about the inner workings of troll farms, I highly recommend Ong and Cabañes’ report, Architects of Networked Disinformation: Behind the Scenes of Troll Accounts and Fake News Production in the Philippines. You can read it for free thanks to the Newton Tech4Dev Network.