Tsunami of Russian Social Media Trolling
Feb. 23rd, 2018 09:45 am

Then the Russians (sometimes operating in the open) settled in to create material for conspiracy theorists, providing fake news stories and videos alleging that the shootings never took place or, in some cases, did take place but were carried out by the government. The related hashtag #falseflag trended for a few days. What was happening was that the Russian Internet Research Agency, along with similar organizations in countries outside of Russia, was seizing on bizarre theories already circulating on social media and amplifying them through fake accounts, bots, and automated postings that could have hundreds of accounts say the same thing at the same time. Meanwhile, video of participants in the events was coupled with misleading headlines and used as “evidence.” But while the Russian trolls and their collaborators were hard at work following the shooting, as they are after many significant events, including the indictment last week of Russian trolls, this time the social media activity seemed to die out quickly, and many of the discussions quietly vanished.
It turns out that the social media companies were having a practice run of their own. Last year, Twitter announced its approach to bots and misinformation, saying that it would begin building new tools to detect bot activity and remove misinformation. A few days after the shootings, Twitter carried out a bot purge, removing roughly 50,000 accounts that exhibited bot-like behavior. Then today, Twitter announced its policy on automation and the use of multiple accounts, which forbids simultaneous tweeting from multiple accounts, along with a list of other related activities used by the Russian trolls and others during the 2016 election, as indicated in the Department of Justice Special Counsel’s commentary explaining the Russian indictments.
Facebook, meanwhile, was taking actions of its own. “Images that attack the victims of last week's tragedy in Florida are abhorrent,” said Mary deBree, Facebook’s head of content policy in an email. “We are removing this content from Facebook.” Indeed, the fake news and conspiracy theory content that I’d spotted a few days ago on Facebook was gone. According to a Facebook spokesperson, the social network is in the process of rolling out a feature that will let users report fake news. While the feature is still being implemented, you can find out how to use it by searching for “reporting fake news” in the help files.
But whether this goes far enough is open to question. The Washington Post reports that YouTube’s top trending video is a conspiracy-theory clip charging one of the Parkland survivors with being a “crisis actor,” supposedly proving that the shooting never happened. The student has since tried to refute that allegation.
Unfortunately, the battle for social media is set to escalate with the ability to produce fake news videos. A team of scientists and engineers at the University of Washington has demonstrated the ability to take someone for whom there’s an ample supply of video, create a photo-realistic likeness, and make that person appear to say anything the creators want. For their tests they used former President Barack Obama, but anyone’s images can be manipulated with artificial intelligence to create a fake video. A spokesperson said that an AI application was able to analyze Obama’s speech patterns from 14 hours of video, and that the same AI technology can also detect fake video. “But even when fake news gets debunked down the road, it can still have profound and damaging effects; what will happen if—but more probably, when—someone makes a fake video of President Donald Trump declaring war on North Korea?” the spokesperson wondered in an email. If there’s anything positive about the social media manipulation and related fake news that surfaced following the tragedy in Parkland, it’s that we had some idea of what to expect.
While there are still many who will gladly consume fake postings and misinformation because they agree with it, most people won’t. Instead, they now know what fake news and misinformation look like, and they can do something about it. Now the challenge is to be ready for the next round of fake news and social media manipulation, one that will be even more convincing.