Hitting the books: Modern social media has made misinformation so, so much worse


It’s not just your uncle who’s no longer allowed at Thanksgiving who has been spreading misinformation online. The practice began long before the rise of social media – governments around the world have been doing it for centuries. But it is only in the modern age, powered by algorithmic recommendation engines designed to endlessly maximize engagement, that nation states have weaponized disinformation to such a degree. In his new book Tyrants on Twitter: Protecting Democracies from Information Warfare, David Sloss, a law professor at Santa Clara University, explores how social media sites like Facebook, Instagram and TikTok have become platforms for political operations with very real and very serious consequences for democracy, and argues that governments should come together to create a global framework to regulate these networks and protect them from information warfare.


Excerpted from Tyrants on Twitter: Protecting Democracies from Information Warfare, by David L. Sloss, published by Stanford University Press, ©2022 by the Leland Stanford Junior University Board of Trustees. All rights reserved.


Governments practiced disinformation long before the advent of social media. However, social media accelerates the spread of both misinformation and disinformation by allowing people to reach large audiences at low cost. “Misinformation” includes any false or misleading information. “Disinformation” is false or misleading information that is deliberately created or strategically deployed to achieve a political objective.

The political objectives of a disinformation campaign can be foreign or domestic. The previous chapters dealt with foreign operations; here, let’s look at domestic disinformation campaigns. The story of “Pizzagate” is a good example. In the fall of 2016, a post on Twitter alleged that Hillary Clinton was “the kingpin of an international child enslavement and sex ring.” The story quickly spread on social media, leading to the creation of a discussion forum on Reddit with the title “Pizzagate.” As various contributors embellished the story, they identified a specific Washington, D.C. pizzeria, Comet Ping Pong, as the base of operations for the child sex ring. “These bizarre and unsubstantiated claims quickly spread beyond the dark underbelly of the internet to relatively mainstream right-wing outlets such as Drudge Report and Infowars.” Alex Jones, the creator of Infowars, “has over 2 million YouTube followers and 730,000 Twitter followers; by spreading the rumors, Jones dramatically increased their reach.” (Jones has since been banned from most major social media platforms.) Eventually, a young man who believed the story arrived at Comet Ping Pong with “an AR-15 semi-automatic rifle…and opened fire, discharging several rounds.” Although the story has been debunked, “pollsters found that more than a quarter of adults polled were either certain that Clinton was connected to the pedophile ring or believed that part of the story must be true.”

Several characteristics of the current information environment accelerate the spread of disinformation. Before the rise of the internet, major media companies like CBS and The New York Times had the ability to deliver stories to millions of people. However, they were generally bound by professional standards of journalistic ethics, so they did not deliberately spread false stories. They were far from perfect, but they helped prevent the widespread dissemination of misinformation. The internet has effectively removed that filtering function of the mainstream media, allowing anyone with a social media account – and a basic working knowledge of how posts go viral – to spread misinformation very quickly to a very wide audience.

The digital age has given rise to automated accounts called “bots.” A bot is “a software tool that performs specific actions on networked computers without the intervention of human users.” Political operatives with a moderate degree of technical sophistication can use bots to speed up the delivery of messages on social media. Additionally, social media platforms facilitate the use of micro-targeting: “the process of preparing and delivering personalized messages to voters or consumers.” In the summer of 2017, political activists in the UK created a bot on Tinder, a dating app, designed to attract new Labour supporters. “The bot accounts sent between thirty thousand and forty thousand messages in all, targeting eighteen to twenty-five year olds in constituencies where Labour candidates needed help.” In the ensuing election, “Labour won or successfully defended some of these targeted constituencies by just a few votes. Celebrating their victory on Twitter, campaign managers thanked… their team of bots.” There is no evidence in this case that the bots were spreading false information, but unethical political operatives can also use bots and micro-targeting to quickly spread false messages through social media.

Over the past two decades, we have seen the growth of an entire industry of paid political consultants who have developed expertise in using social media to influence political outcomes. The Polish company mentioned earlier in this chapter is an example. Philip Howard, a leading expert on disinformation, says, “It’s safe to say that every country in the world has a local political consultancy firm that specializes in marketing political disinformation.” Political consultants work with data mining companies that have accumulated huge amounts of information about individuals by collecting data from various sources, including social media platforms, and aggregating this information into proprietary databases. The data mining industry “provides the information campaign managers need to make strategic decisions about who to target, where, when, with what message, and on what device and platform.”

Political consultancies use both bots and human-run “fake accounts” to spread messages through social media. (A “fake account” is a social media account operated by someone who assumes a false identity for the purpose of misleading other social media users as to the identity of the person operating the account.) They leverage data from the data mining industry and the technical features of social media platforms to engage in highly sophisticated micro-targeting, sending personalized messages to particular groups of voters to shape public opinion and/or influence political outcomes. “Social media algorithms allow campaign messaging to be continually tested and refined, so the most advanced behavioral science techniques can refine the message in time for those strategically crucial final days” before an important vote. Many of these messages are undoubtedly true, but there are several well-documented cases in which paid political consultants have deliberately disseminated false information in the service of a political objective. For example, Howard documented the Vote Leave campaign’s strategic use of disinformation in the final weeks before Britain’s Brexit referendum.

It bears emphasis that disinformation does not need to be believed in order to erode the foundations of our democratic institutions. Disinformation “does not necessarily succeed by changing minds but by sowing confusion, undermining trust in information and institutions, and eroding shared reference points.” For democracy to work effectively, we need those shared reference points. An authoritarian government can compel citizens to wear masks and practice social distancing during a pandemic by instilling fear that leads to obedience. In a democratic society, by contrast, governments must persuade a large majority of citizens that the scientific evidence demonstrates that wearing masks and practicing social distancing saves lives. Unfortunately, misinformation spread on social media undermines trust in government and scientific authorities. Without that trust, it becomes increasingly difficult for government leaders to build the consensus needed to formulate and implement effective policies to address pressing social problems, such as slowing the spread of a pandemic.


