In the three months leading up to the 2016 US Presidential Election, a false story claiming that President Obama had signed an order banning the Pledge of Allegiance received 2.2 million reactions, comments, and shares on Facebook. A surge of similar fake news articles, coupled with the surprising presidential victory of Donald Trump, sparked US intelligence investigations into potential foreign interference through social media channels. In 2017, investigators confirmed that Russian agents had purchased thousands of ads on Facebook, commenting on a wide range of social and political issues with a distinct pro-Trump bias. By seizing on the most controversial and inflammatory topics, these ads sought to foster a divided and disoriented political climate in the United States. Although Trump eventually conceded that Russia interfered in the US election, it was too little, too late. While it is difficult to pinpoint the exact impact of Russian social media intervention on the 2016 Election, its reach was undeniably vast. Moving past the 2016 Election, we must consider the dangerous role of social media in spreading misinformation to the masses.
On September 25th, McGill’s Max Bell School of Public Policy hosted an event at which Kevin Chan, the global director and head of public policy for Facebook Canada, spoke about Facebook’s quest to protect election integrity. Mr. Chan came across as honest, owning up to Facebook’s prior mistakes and offering pragmatic solutions to prevent the spread of fake news. These measures include using artificial intelligence to detect and remove false content, as well as partnering with independent fact-checkers to monitor the spread of news. If content is deemed factually incorrect by third-party fact-checkers, its distribution across Facebook falls by more than 80 per cent, as noted by Chan. Although these measures are a step in the right direction, constant re-evaluation of Facebook’s algorithms and protective measures is needed to preserve the factuality of news on social media. Facebook must continually update its security protocols as malicious actors adapt their means of spreading doubt and misinformation.
For readers, the recent deluge of fake news serves as a reminder to remain cognisant of the sources from which we get our news online. As social media platforms implement more thorough measures to combat fake news, users cannot grow complacent; these measures will prove redundant if readers do not remain diligent in assessing the validity of the information they consume. Facebook must manage a difficult balancing act: ensuring that deceptive articles do not go unpunished, while refraining from becoming an arbiter of which news is — and is not — reputable. The measures implemented thus far are a promising start to the long rebuilding process Facebook must undergo to regain the trust of its users. In the fight for the preservation of facts, fake news will remain a force to be reckoned with, but Facebook appears well positioned to win the battle.