'Disinformation' Is The Word Of The Year — And A Sign Of What's To Come

Columbia Journalism Review set up a misinformation newsstand in Manhattan in October 2018, in an effort to educate news consumers about the dangers of disinformation in the lead-up to the U.S. midterm elections. (Angela Weiss / AFP via Getty Images)

As always, this year's word of the year candidates came from all over. There were the viral memes like "OK, boomer" and "weird flex, but OK," but they won't endure any longer than earlier years' candidates like FOMO and "manbun." "Quid pro quo" had a moment, but the jury's still out on that one. And a surge in dictionary lookups led Merriam-Webster to pick nonbinary "they."

My choice of "disinformation" needs some explaining. It isn't a new word — just one of the family of names we give to the malignancies that contaminate the public discourse, along with "propaganda," and in particular "misinformation" and "fake news." Each of those last two was chosen as word of the year by some dictionary or organization in 2017.

But over the past couple of years "disinformation" has been on a tear — it's 10 times as common in media headlines as it was five years ago, to the point where it's nudged its siblings aside. That rise suggests a basic shift in focus: What most troubles us now isn't just the plague of deceptive information on the Internet, but the organized campaigns that are spreading the infection.

Most of those headlines concerned the Russians. There was their weaponization of social media during the 2016 elections, which The New York Times called "the Pearl Harbor of the social media age," and the fear that it will be repeated next year. There were also the stories about their interference in recent elections in the U.K., Italy and other nations. And most recently, there was the Russians' success in planting the conspiracy theory that it was Ukraine rather than Russia that interfered in our 2016 elections, despite its having been debunked by U.S. intelligence agencies.

Disinformation is as old as human conflict — in the fifth century B.C.E., the great Chinese military theoretician Sun Tzu wrote that all warfare is based on deception. But the Russians can take credit for inventing the word itself. The term "dezinformatsiya" was reputedly coined by no less than Josef Stalin in the 1920s as the name of the section of the KGB tasked with deceiving enemies and influencing public opinion. Over the decades, that unit disseminated rumors by means of forgeries, moles, front organizations, fake defections and sympathetic fellow travelers, which, by the way, is another term that was translated from Russian. The Soviets put out that Pope Pius XII was a Nazi sympathizer and that the CIA had assassinated John F. Kennedy and invented the AIDS virus.

"Dezinformatsiya" was anglicized to "disinformation" during the Cold War era and extended to Western intelligence operations. The characters in John le Carré's spy novelsare always talking about planting disinformation to deceive the KGB, using the same clandestine techniques the Soviets did. But the advent of social media created a new field of play and a new panoply of tools for diffusing and amplifying disinformation: trolls and troll farms, bots, hacked accounts and microtargeting.

The Russians weren't the only ones to see the possibilities. In a recent report called the Global Disinformation Order, the Oxford Internet Institute identified organized social media campaigns in 70 nations. Authoritarian regimes use social media domestically to discredit political opponents. A dozen or so nations, such as Russia, China, Iran, Pakistan and Saudi Arabia, use it to influence opinion in foreign nations.

Dictionaries typically define "disinformation" as the dissemination of deliberately false information, and modern disinformation campaigns all make use of the mendacious techniques we associate with the Orwellian propaganda of the totalitarian states of the last century. They generate a deluge of deceptive narratives that some describe as the "firehose of falsehoods" concocted to glorify a leader or a cause or to malign their enemies. But these campaigns are not all lies. They're also aimed at sharpening tribal divisions and sowing confusion or apathy, and a lot of their effort goes to building out networks of followers. And for those purposes, a true report or even a benign cat photo can sometimes be just as effective as a blatant falsehood. You have to win friends to influence people.

In fact, as Clemson University researchers Darren Linvill and Patrick Warren point out, a lot of the disinformation produced by the Russians is just spin, and they have taken their tactics from modern public relations and advertising — as Linvill and Warren put it, they're less like Boris and Natasha than like Don Draper.

An ad man like Don Draper would have recognized the classic marketing tactic that led the Russians to fabricate the rumors of Ukraine's interference in the 2016 election. You undermine a competitor's product by creating a counternarrative to sow fear, uncertainty or doubt — what marketers call the FUD factor.

But Draper would have marveled at how porous our online discourse was — how easy it was to inject an implausible rumor into its bloodstream. And he would have been astonished at how quickly the rumor would find receptive hosts in public life. Stalin, maybe not so much.

Tech companies are trying to purge the brigades of foreign trolls and bots, but experts say that's an endless game of whack-a-mole. And they predict that by 2020 the majority of false content will be generated by Americans imitating the Russians' tactics, often with the help of PR companies that specialize in such campaigns. And the tech companies balk at removing domestic content just because it's false, pleading free expression.

Call it disinformation, or call it computational propaganda or cyber-enabled information warfare, as some have done: The struggle against it is our new forever war, and our vocabulary is having to adapt to it.

Copyright 2021 Fresh Air. To see more, visit Fresh Air.

Geoff Nunberg is the linguist contributor on NPR's Fresh Air with Terry Gross.