Just a few years ago, the word “troll,” in the context of the internet, had barely entered our lexicon. Now trolling can be a well-paid career, especially if you live in Russia. You have to admire the human ability to find innovative and profitable ways to inflict pain on others.
Around 8:30 a.m. on Sept. 11 last year, Duval Arthur, director of the Office of Homeland Security and Emergency Preparedness for St. Mary Parish, Louisiana, got a call from a resident who had just received a disturbing text message. “Toxic fume hazard warning in this area until 1:30 PM,” the message read. “Take Shelter. Check Local Media and columbiachemical.com.”
St. Mary Parish is home to many processing plants for chemicals and natural gas, and keeping track of dangerous accidents at those plants is Arthur’s job. But he hadn’t heard of any chemical release that morning. In fact, he hadn’t even heard of Columbia Chemical. St. Mary Parish had a Columbian Chemicals plant, which made carbon black, a petroleum product used in rubber and plastics. But he’d heard nothing from them that morning, either. Soon, two other residents called and reported the same text message. Arthur was worried: Had one of his employees sent out an alert without telling him?
If Arthur had checked Twitter, he might have become much more worried. Hundreds of Twitter accounts were documenting a disaster right down the road. “A powerful explosion heard from miles away happened at a chemical plant in Centerville, Louisiana #ColumbianChemicals,” a man named Jon Merritt tweeted. The #ColumbianChemicals hashtag was full of eyewitness accounts of the horror in Centerville. @AnnRussela shared an image of flames engulfing the plant. @Ksarah12 posted a video of surveillance footage from a local gas station, capturing the flash of the explosion. Others shared a video in which thick black smoke rose in the distance.
Dozens of journalists, media outlets and politicians, from Louisiana to New York City, found their Twitter accounts inundated with messages about the disaster. “Heather, I’m sure that the explosion at the #ColumbianChemicals is really dangerous. Louisiana is really screwed now,” a user named @EricTraPPP tweeted at the New Orleans Times-Picayune reporter Heather Nolan. Another posted a screenshot of CNN’s home page, showing that the story had already made national news. ISIS had claimed credit for the attack, according to one YouTube video; in it, a man showed his TV screen, tuned to an Arabic news channel, on which masked ISIS fighters delivered a speech next to looping footage of an explosion. A woman named Anna McClaren (@zpokodon9) tweeted at Karl Rove: “Karl, Is this really ISIS who is responsible for #ColumbianChemicals? Tell @Obama that we should bomb Iraq!” But anyone who took the trouble to check CNN.com would have found no news of a spectacular Sept. 11 attack by ISIS. It was all fake: the screenshot, the videos, the photographs.
In St. Mary Parish, Duval Arthur quickly made a few calls and found that none of his employees had sent the alert. He called Columbian Chemicals, which reported no problems at the plant. Roughly two hours after the first text message was sent, the company put out a news release, explaining that reports of an explosion were false. When I called Arthur a few months later, he dismissed the incident as a tasteless prank, timed to the anniversary of the attacks of Sept. 11, 2001. “Personally I think it’s just a real sad, sick sense of humor,” he told me. “It was just someone who just liked scaring the daylights out of people.” Authorities, he said, had tried to trace the numbers that the text messages had come from, but with no luck. (The F.B.I. told me the investigation was still open.)
The Columbian Chemicals hoax was not some simple prank by a bored sadist. It was a highly coordinated disinformation campaign, involving dozens of fake accounts that posted hundreds of tweets for hours, targeting a list of figures precisely chosen to generate maximum attention. The perpetrators didn’t just doctor screenshots from CNN; they also created fully functional clones of the websites of Louisiana TV stations and newspapers. The YouTube video of the man watching TV had been tailor-made for the project. A Wikipedia page was even created for the Columbian Chemicals disaster, which cited the fake YouTube video. As the virtual assault unfolded, it was complemented by text messages to actual residents in St. Mary Parish. It must have taken a team of programmers and content producers to pull off.
And the hoax was just one in a wave of similar attacks during the second half of last year. On Dec. 13, two months after a handful of Ebola cases in the United States touched off a minor media panic, many of the same Twitter accounts used to spread the Columbian Chemicals hoax began to post about an outbreak of Ebola in Atlanta. The campaign followed the same pattern of fake news reports and videos, this time under the hashtag #EbolaInAtlanta, which briefly trended in Atlanta. Again, the attention to detail was remarkable, suggesting a tremendous amount of effort. A YouTube video showed a team of hazmat-suited medical workers transporting a victim from the airport. Beyoncé’s recent single “7/11” played in the background, an apparent attempt to establish the video’s contemporaneity. A truck in the parking lot sported the logo of the Hartsfield-Jackson Atlanta International Airport.
On the same day as the Ebola hoax, a totally different group of accounts began spreading a rumor that an unarmed black woman had been shot to death by police. They all used the hashtag #shockingmurderinatlanta. Here again, the hoax seemed designed to piggyback on real public anxiety; that summer and fall were marked by protests over the shooting of Michael Brown in Ferguson, Mo. In this case, a blurry video purported to show the shooting, as an onlooker narrated. Watching it, I thought I recognized the voice — it sounded the same as the man watching TV in the Columbian Chemicals video, the one in which ISIS supposedly claims responsibility. The accent was unmistakable, if unplaceable, and in both videos he was making a very strained attempt to sound American. Somehow the result was vaguely Australian.
Who was behind all of this? When I stumbled on it last fall, I had an idea. I was already investigating a shadowy organization in St. Petersburg, Russia, that spreads false information on the Internet. It has gone by a few names, but I will refer to it by its best-known name: the Internet Research Agency. The agency had become known for employing hundreds of Russians to post pro-Kremlin propaganda online under fake identities, including on Twitter, in order to create the illusion of a massive army of supporters; it has often been called a “troll farm.” The more I investigated this group, the more links I discovered between it and the hoaxes. In April, I went to St. Petersburg to learn more about the agency and its brand of information warfare, which it has aggressively deployed against political opponents at home, Russia’s perceived enemies abroad and, more recently, me.