‘The Internet Is a Crime Scene’
"The way in which we operate online has a lot to do with good faith. We have good faith that people are posting things that are true," says Donovan. "Our algorithms have good faith that a thing that says it's news is news. That's the kind of openness of the system that we built. And now we've realized that you only get a few good years with that, because bad people with bad intentions figure out how to use that system to their own ends."
In the months leading up to last week's insurrection, pro-Trump conspiracy theories repeatedly found purchase among the president's supporters. Some posited that the coronavirus pandemic was a hoax to hurt Trump. Or that mail-in ballots were a fraud being perpetrated to hurt Trump. Or that last summer's protests against the police killings of Black Americans were secretly organized and funded by liberal billionaire George Soros to hurt Trump. And on and on.
"It was repetitive," says Donovan. "Over and over and over again, it had the same form, but the target didn't change."
For true believers, that drumbeat of misinformation deepened conviction and "drew together a bunch of people who saw Trump as an embattled leader facing the biggest theft of the century," says Donovan.
Where does America go from here? How should we think about the internet when it so easily radicalizes people? And is there any way to use the internet to deradicalize them? For answers to all of that, POLITICO Magazine spoke with Donovan this week. A condensed transcript of that conversation follows, edited for length and clarity.
We are a week removed from the violent insurrection at the U.S. Capitol, which was largely the result of misinformation and disinformation, about both the outcome of the election and things like the QAnon conspiracy theory. You study and track this stuff. If you pull back, what is the state of misinformation right now?
The internet is a crime scene. It was cacophonous in terms of building this community of people who went to D.C., and now we see the cleanup of all of the online ephemera being removed, for good reason. But while that cleanup happens, the DOJ and FBI and journalists and researchers and civil society are doing everything they can to collect data.
We're collectively witnessing the aftermath of probably one of the biggest lies ever told, in terms of the amount of people it reached and the effects that it had: the claims about election fraud, which then led to the event at the Capitol. What is different specifically about January 6, and what we have to reckon with, are the moments that led us there. Along the way, everybody knew that [pro-Trump attorneys] Rudy Giuliani and Sidney Powell and Lin Wood were conducting law in bad faith. They were doing everything they could online to make it seem as if they had legitimate court cases and that, if not for the horrible courts and judges, we would have had a different outcome in the election. The repetition shows me there was a plan and that it was coordinated. There's much more culpability to be spread around.
You've described the insurrection as the result of "networked conspiracism." What do you mean by that?
When we use the word networked, were trying to address the scale by which people are communicating with each other about a particular topic, the way in which technology brings people and ideas together into the same space, and then the durability of those ideas in those communities.
Conspiracies usually take the form of rumors that travel through communities. For instance, Brandi Collins-Dexter, a research fellow with our team [at Harvard], has talked about conspiracies in the Black community that were somewhat protective: the idea that there was something bad in the water in Flint, which circulated before people had evidence that it was real. Conspiracies can actually fulfill a function for communities, warning them about very powerful political and moneyed interests that may be harming them.
But in the case of QAnon, a networked conspiracy really drew together a bunch of people who saw Trump as an embattled leader facing the biggest theft of the century: the theft of the election. In the lead-up to that, you had iterations of different conspiracies circulating through different communities online, saying that the pandemic was a hoax to bring down Trump, and the vaccine was a hoax to bring down Trump, and mail-in ballots were a hoax to bring down Trump, and [Anthony] Fauci was a hoax to bring down Trump, and so on. It was repetitive. Over and over and over again, it had the same form, but the target didn't change.
Facebook and Twitter and YouTube did take some action to try to delete some QAnon content, especially as it morphed into a more militarized social movement during the reopen protests. But the damage had already been done.
Before 2020, major tech platforms rarely interjected to quell falsehoods or conspiracy-minded content. That changed notably when it came to Covid-19 and election misinformation. Why do you think it took so long, and has what they've done been successful?
There's no financial or political incentive to look for the evidence [of misinformation and conspiracies being spread]. It took these companies years to address the nature of the problem. There were numerous instances in which platform companies should have taken action. Several very public scandals about misinformation had to happen: around Brexit, around the Russian Internet Research Agency.
After Charlottesville, we saw a moment of exception, when platform companies decided they were going to take off [certain users and content] in what later became their policies on dangerous groups and dangerous individuals. Downstream, that's where you see the removal of the Proud Boys and Alex Jones.
And that leads me to think about the failure in this moment. The way media manipulators and disinformers operate is that they tend to leapfrog across different platforms. It's to their benefit to have all of the platforms at their disposal. However, if they can use one or two of them effectively, they can get that information to circulate on other platforms.
The lack of guardrails that transcend the corporations that provide these services really led to this moment. And unfortunately, we've gotten to this situation because disinformation is an industry.
Describe that. What does the disinformation industry look like at this moment?
It's good money. And you can use it to wage an insurrection. I'm thinking here about someone like Steve Bannon, who has always seen media as a war of position and is very effective at making sure that the disinformation campaigns he designs stay in a steady drumbeat in the more mainstream media ecosystem for months. It's no surprise, of course, that Bannon was behind some of the pieces of disinformation related to Dominion Voting Systems, as well as probably one of the biggest scientific scams, or cloaked-science operations, that we've seen in a long time: He was responsible for flying a post-doc researcher from Hong Kong to the United States and then helping her write and publish this thing called the Yan Report, which alleges that Covid-19 was a Chinese bioweapon, and in which the author claims she worked in a lab developing it. That staging takes an incredible amount of money. So, Bannon is working with an exiled Chinese billionaire [Guo Wengui] who runs G News [a pro-Trump and anti-Chinese Communist Party online outlet] and shares an interest in destabilizing any and all foreign policy. Bannon really shows that there's money to be made, as well as political gains.
When I think about this in terms of my research, I think about the true costs of misinformation. The openness and the scale, which used to be the virtues of platforms and the internet, have now been thoroughly appropriated and weaponized by disinformation-campaign operators.
There's a concept I've heard you refer to when talking about media manipulation: "trading up the chain." Can you describe how that works?
Yeah, so that's actually from a book by Ryan Holiday called Trust Me, I'm Lying. He used to design viral media campaigns, and he noticed that if low-level bloggers covered, say, a movie he was marketing, it would quickly trade up the chain, because people in online news organizations and cable news looked to those low-level blogs for interesting stories. In his book, he even talks about posing as outraged people who have seen a certain billboard or a movie trailer or this controversial ad; then you can make it seem like even more of a story.
The idea is simple: You create outrage about something, and then you try to get journalists to pay attention to it. We see that tactic happen often on the internet. On anonymous forums and message boards, it's almost a game: if there is a breaking news event, they will try to plant disinformation (misidentifying a mass shooter is one of the favorites in this world) and then try to get journalists to report the wrong thing.
We probably saw the most interesting case study of that mechanism when we saw an attempt at it fail with the Hunter Biden laptop story. [Trump allies] were going to plant it in the New York Post, which had very low standards of editorial verification, and it was very clear to everyone that there was something amiss, be it the acquisition of the laptop or the way in which it found its way to Rudy Giuliani. Nobody knew what other ephemera was on the computer, like recipes or grocery lists, just that there were crimes and sex tapes and drugs.
In that instance, trying to trade up the chain backfired. Everybody who works in [the disinformation-debunking sphere] knows that disinformers consistently go back to the same tactics. We were all ready for a hack-and-leak operation. That particular tactic really wasn't going to work the way that they had hoped.
You describe this cycle of media manipulation where people with power, whether elected officials or those with resources, spread disinformation about, for instance, the election being rigged. And then they point to the fact that their supporters believe that disinformation as evidence that this is something real people are worried about and needs to be taken seriously. How do you get out of that cycle? It seems like it's a downward spiral.
It does, and it's been feeling like that for several years now. At this stage, I think even calling it "disinformation" is doing a disservice, because as a researcher this is on a scale that I haven't seen, and it touches every kind of media that we have.
Our media ecosystem is incredibly fragile and broken. The way in which we operate online has a lot to do with good faith. We have good faith that people are posting things that are true. We have good faith that people are communicating openly and honestly with one another. Our algorithms have good faith that a thing that says it's news is news. That's the kind of openness of the system that we built. And now we've realized that you only get a few good years with that, because bad people with bad intentions figure out how to use that system to their own ends. And they pay money to do it, profit off it and face no consequences for their actions.
It's telling how many interviews I've done this week that center around platform companies taking away Trump's [social media] toys, as if that's the worst thing you could do to the leader of the U.S. government. And that's because we have very low faith in institutional accountability. We have very low faith in our governance structure to be able to remedy the poison that Trump brought to the Capitol. And that's what I worry about when I think about how we fix this, or how we move beyond it, or how we have peace again.
I place responsibility for what happened with our gatekeepers. That is, the politicians who were allowed to get away with this, who flew in a flank behind Trump in the lead-up to this and made it seem like there was going to be another outcome on January 6 if the crowd intervened. I place responsibility with them.
I don't want to say that it's as if some groups of people have been mindlessly deluded into this, or that somehow technology performed hypnosis. These are people who believe the system is woefully broken, which a lot of us can agree with. But they've also chosen who they're going to believe. And in that case, they're not operating on facts; they're operating on belief. And they believe Trump was chosen by God, and that he called upon them to go to the Capitol and save him.
And if you listen, honestly, it's not deep. It's so shallow. It's so in-your-face. It's right there when you look at the Proud Boys the night before [the insurrection] chanting, "F–k Antifa" and "1776." You've just got to believe that that's what they believe. You can't use your imagination and assume, "Well, they must not really believe that" or "They must not know."
If the internet can radicalize people, can we also de-radicalize people online?
There's a mass phenomenon in which you can bring people into a worldview and have them see it your way. When we study this, we look for signs along the way that a YouTuber or podcaster is trying to radicalize people. How do they talk about gender? Are they talking about women denying you their bodies, rather than women being thinking beings (and, hey, you're not that cool)? What are they saying about people of color? Are they telling you that Black people are stealing things from you, that they're not doing as much in this world as you are, that they are not as deserving of college scholarships? How are they talking about immigrants? Are they saying immigrants are siphoning off the goodwill of white Americans and that they're better off in another country where they can't get the benefits of living in America?
As you look at these "red pills," you realize that they almost always come in a package. And the hardest one is actually the oldest: It's the Jews. But very rarely do we hear a radicalized podcaster start with the Jews, because we have a society where people hear that and right away it raises flags: "Did I just hear you say … ?" So, they start [more subtly] and move toward the Jewish question. They pose this idea that actually everyone you voted for has nothing to do with the governance of the country, and that it's really this small group of people controlling the government, controlling the media, controlling entertainment. And you can see it laid out on some of the anonymous message boards, in the memes that just freely circulate: "Here are all the Jewish people that work at CNN."
If you're living in those worlds where you see that stuff day in and day out, you're no longer shocked by seeing swastikas. You're no longer shocked by hearing that some woman was harassed and assaulted. You're almost excited by it. When you see people of color assaulted by police, you get excited about that. That, to you, is redemption for a world that has wronged you.
When I think about deradicalizing someone or bringing someone back from the brink, that is a process of love and understanding and community-building. I don't think it's necessarily just like, "Oh, somewhere along the way, this individual must have been broken, and then this filled that void." We're all broken in some way. All of us have had something that we've had to overcome. It's really about those moments when someone could have led you out of that.
But the technology we use that connects people, we failed in the design of it. [Instead of getting led out,] people get endless streams of Nazi propaganda under the guise of history, or endless streams of podcasts that are taking in-the-moment news about Jeffrey Epstein and the horrible things that he did, and tying that back to, you know, Jewish people or whatever. Technology gives you easy access to these things. It's all laid out for you.
When we do this research in our lab, we have an adage: "Let the algorithms do the work." If I want to discover white-supremacist content, I find a little bit of it and then let the algorithms do the work. I click every recommendation. I follow every suggested follow. And within an hour, I have a corpus of data to work with.
Until the moment that is no longer true, we're going to keep doing this research and we're going to keep banging on the doors of platform companies saying that something's really broken, and we've got to fix it.