Twitter Files Prove That Russia's Disinformation Operations Succeeded
Disinformation campaigns aren't about convincing you of lies but about creating a permission structure to believe what is convenient.
I’m sure you’ve seen the story about the Twitter Files. The flagship story of the first dump of the files was Hunter Biden’s laptop. When the story broke, virtually anyone who didn’t want Donald Trump to be re-elected treated it as a fake Russian operation. Reporters ignored the story, commentators dismissed it, and a long list of intelligence community veterans said with certainty that it was another Russian disinformation operation.
That Rudy Giuliani somehow got his hands on a laptop belonging to the former Vice President’s son, supposedly dropped off for repair with a legally blind shop owner in Delaware, and that it became a story weeks before Election Day in a staunchly pro-Trump newspaper, one whose writer refused to put their name on it, made it seem quite obvious that the whole thing was made up, especially given that we had gone through the same kind of Russian [dis]information operation around the same time in 2016. Except that it wasn’t.
Thomas Rid, an information security expert at Johns Hopkins University’s School of Advanced International Studies (of which I happen to be an alumnus, and which is, in my non-biased fact-opinion, the greatest graduate school in the history of the universe), writes in his 2020 book, Active Measures:
At-scale disinformation campaigns are attacks against a liberal epistemic order, or a political system that places its trust in essential custodians of factual authority. These institutions—law enforcement and the criminal justice system, public administration, empirical science, investigative journalism, democratically controlled intelligence agencies—prize facts over feelings, evidence over emotion, observations over opinion. They embody an open epistemic order, which enables an open and liberal political order; one cannot exist without the other. A peaceful transition of power after a contested vote, for example, requires trusting an election’s setup, infrastructure, counting procedures, and press coverage, all in a moment of high uncertainty and political fragility. Active measures erode that order. But they do so slowly, subtly, like ice melting. This slowness makes disinformation that much more insidious, because when the authority of evidence is eroded, emotions fill the gap. As distinguishing between facts and non-facts becomes harder, distinguishing between friend and foe becomes easier. The line between fact and lie is a continuation of the line between peace and war, domestically as well as internationally.
A key objective of [dis]information operations is to make you suspicious of any information, especially inconvenient news. For Biden supporters, the Hunter laptop story was inconvenient, and the 2016 Russian operation had created the permission structure to be suspicious of inconvenient news. So it became very easy to dismiss what we didn’t like to hear as fake news.