This blog is part of a series of articles written by Georgetown University students, published as part of Counterpart’s Next Generation in Thought Leadership initiative. The opinions expressed in this article are the author’s and do not express the views or opinions of Counterpart International.
Before I went to bed a few nights ago I was scrolling through my Instagram like every other 20-something-year-old with insomnia. Just as I was about to doze off, I saw a post by my former high school track coach bragging about her account being suspended. The image was captioned, “My Badge of Honor,” and accented with an American flag and three hearts in the colors of red, white, and blue.
So, of course, I clicked on her account.
In her bio was a typical collection of personal descriptors with emojis, like “Pilates Instructor” or “Personal Trainer.” The first descriptor she used, however, caught my attention: “Truth Seeker.” I began scrolling through her page and quickly noticed that it was filled with conspiracy theories, propaganda videos attacking Democrats, right-wing memes, and “informational” videos about secret societies and child trafficking. Scrolling back about a year, hidden behind all the propaganda, was a smattering of personal posts advertising her business or celebrating her family.
I was horrified.
How did my former track coach, known for being an encouraging, hard-working, intelligent, and tough woman, become another soapbox for the worst of the internet? Well, when social media companies, media organizations, and governing bodies have not made a strong enough effort to curb the spread of misinformation and violent conspiracy theories across the internet and social media platforms, it’s easy to understand.
Ok, how did this happen?
I first saw this misinformation on my track coach’s Instagram. In posts where she tagged people, I viewed their pages; they posted about the same kinds of conspiracy theories she had.
Conspiracy theories and propaganda are not new to America. Persuading Americans with propaganda dates back to World War I, and conspiracy theories gained steam around World War II. With the advent of social media, though, people can now share anything they think, write, or read (or didn’t read) with the whole world.
Communicating to the internet-connected world with the tap of a phone screen is new. Misinformation – fake news – spreads like wildfire on social media and is often more likely to elicit engagement. Compounding this spread is that oftentimes, people who start to engage with this misinformation find themselves in an “echo chamber,” as one study described it. Users will follow a narrative, sometimes conspiratorial, that aligns in some way with their worldview, and then follow others or join groups with similar beliefs, which creates greater polarization. These “truth seekers” are eager to share their discoveries and amplify them, believing them to be true.
When, however, these stories are created intentionally by a writer who knows they are fake, and are then spread deliberately and maliciously with either a political or financial motive, it becomes disinformation, or propaganda.
So where do these articles come from?
For one, places like Veles, Macedonia, home to over 100 pro-Trump websites. The owners of these sites search the internet for any pro-Trump article, outlandish or not. Then they post it on their websites and in Facebook groups using one of their many fake profiles. From there, these stories, or similar ones, are shared and clicked hundreds of millions of times.
There is also concern that a wealth of disinformation articles comes from Russian-sponsored websites. The State Department released a report in August 2020 detailing what it calls a Russian “propaganda ecosystem,” which is “a collection of official, proxy, and unattributed communication channels and platforms that Russia uses to create and amplify false narratives.” While researchers have debated how much these articles swayed the vote, it’s hard to deny the influence they have on Americans. According to a 2016 Pew Research poll, 62% of American adults got at least some of their news from social media, with two-thirds of Facebook users accessing news there.
So once again, social media is the problem?
No, disinformation is not limited to social media. Take Fox News, for example: a Media Matters study analyzed Fox News programming for the first four months of 2019 and found that the network propagated some type of disinformation every single day. This becomes more problematic when some of that disinformation ties back to white supremacy. Shows like “Fox and Friends” or “Tucker Carlson Tonight” have been known to make outrageous claims, but when the most popular cable news network in America defends people like Alex Jones or movements like QAnon, it’s an escalation that can lead to acts of violence.
But is it really that big of a deal?
Websites like Breitbart News and Alex Jones’s InfoWars go even further than traditional media. They spread hate and violent theories and can be breeding grounds for violent extremism. Let’s look at two examples. In October 2016, a Twitter account falsely claimed that the NYPD had discovered a pedophile ring linked to the Democratic Party. Conspiracy theorists picked up the lie, and the false story spread around social media, becoming associated along the way with a restaurant in Washington, D.C. Fake news websites like InfoWars then started talking about it. At some point a man named Welch came across the story; in December, he went to the restaurant and opened fire. Why? To save the children supposedly being held as sex slaves in the basement. This would come to be known as Pizzagate.
Our second example is QAnon, a group that operates within the misinformation world. The group is known for spewing anti-Semitic, racist, and fringe claims, including perpetuating stories like Pizzagate. Donald Trump has magnified its messaging, and House of Representatives member Marjorie Taylor Greene is known to advocate its theories. Unsurprisingly, self-proclaimed QAnon members and their messaging have been connected with the Capitol riot of January 6th.
Of all the conspiracies my coach posted, she mentioned protecting children from trafficking the most. She even has an Instagram Highlight dedicated to a child trafficking post. Clicking on the Highlight brings you to a screenshot from Facebook; at the bottom of the caption is “#save0urchildren,” a hashtag QAnon uses to spread its ideology.
Can we stop this?
Not entirely, but we can take steps to diminish the influence of misinformation and disinformation. For starters, we can teach ourselves, our friends, and our families better social media literacy. As this podcast suggests, we need to read, listen to, and watch the information we receive critically before deciding to share it. Not everyone has the tools to do this, however. One way to alleviate this problem would be for schools to teach media literacy in technology or library classes. Alternatively, public libraries could offer a “survival guide to fake news,” as this library did in Seattle. Secondly, we should check the sources and supporting evidence behind information to make sure it’s legitimate.
Finally, it’s up to social media and traditional media to take the spread of misinformation and disinformation more seriously. Recently, Facebook has come under fire for not doing more to stop the spread of misinformation. Both citizens and the US government need to hold social media companies accountable. When the spread of disinformation radicalizes people, though, it falls on institutions to act.
The Brookings Institution put out a piece detailing how to counter extremist groups, recommending policy responses that “include multifaceted law-enforcement actions; steps to counter growing political capital, recruitment, and broader political polarization; and reintegration efforts.” However, I would like to add one aspect that may not be considered: including women in the conversation. Historically, women are an afterthought when governments try to counter violent extremism, even though they have participated in violent extremist movements for most of American history. Too often, women are seen as caretakers, too delicate to participate in or help promote violent groups. Leaving women out of the narrative creates ample opportunity for them to slip into extremism, especially with groups like QAnon, where women play an immense part in disseminating its theories online.
My high school track coach doesn’t appear to have acted on any of these conspiracies or joined an extremist group yet. From what I have seen, she has taken up the role of spreading a wide swath of right-wing messages. But every week she seems to post more and more. Will her radicalization be ignored because she’s a high school coach, a caretaker of children? A woman? How long before the wrong person reaches out or she sees that one post that makes her want to take the next step towards extremism?