Subscribe here: Apple Podcasts | Spotify | YouTube
In this episode of Galaxy Brain, Charlie Warzel sits down with Eliot Higgins, founder of the open-source investigative collective Bellingcat, to examine how our public sphere slid from healthy debate into what Higgins calls “disordered discourse.” Higgins is an early internet native who taught himself geolocation during the Arab Spring and later built Bellingcat’s global community. He has spent the past decade exposing war crimes and online manipulation with publicly available data. Higgins has recently come up with a framework to help understand our information crisis: Democracies function only when we can verify truth, deliberate over what matters, and hold power to account. All three are faltering, he argues.
In this conversation, Warzel and Higgins trace the incentives that broke the feed: how algorithms reward outrage, how “bespoke realities” form, why counterpublics can devolve into virtual cults, and what “simulated” accountability looks like in practice. They revisit Higgins’s path from early web forums to Bellingcat, look at the MAGA coalition as a patchwork of disordered counterpublics, and debate whether America is trapped in a simulated democracy. Higgins offers a clear diagnosis—and a plan for how we might begin to claw back a shared reality.
The following is a transcript of the episode:
Charlie Warzel: Hey, everybody. It’s Charlie. And before we get to today’s episode, I have a request for all of you listeners. We’re working on a story about screen time. And when we tend to talk about screen time, often the conversation will be focused on younger people. We’re worried that they’re getting too much screen time, or they’ve been radicalized by what they see on their devices, or that they don’t seem to understand how they’re being manipulated.
But I’ve gotten a lot of anecdotal reporting over the last few years that the problem is similar, if not worse, on the other side of the age spectrum. And so we wanna do a story about a different generation’s relationship to this technology. We’d really love to hear from you. Whether you are somebody who is having some of these problems, or you feel your relationship to your device has become a bit problematic or lopsided, we’d really like to hear from you. Tell us your age and why you feel you have an unhealthy relationship with your device. If you’ve noticed this with a family member, we also want to hear from you. So if you could send us a brief voice memo—about a minute, no longer—we’d absolutely love that. More than anything, we want you to emphasize and describe what you’re seeing and what you’re feeling about your loved one’s screen time or your own, and we want you to express whatever honest amount of concern you have.
Please send that voice memo to cwarzel@theatlantic.com. It’s cwarzel@theatlantic.com. Thank you so much, and here’s today’s episode.
Eliot Higgins: In terms of deliberation, you have these, you know, “20 X versus one Y” videos, which kind of do this before.
Warzel: Oh, you mean the Jubilee videos?
Higgins: Yeah, those. I despise those. I think they’re just a strong example of the kind of hollow performance of democracy—that no one’s there to learn from each other, to come to a shared understanding.
They’re there for clips to get attention on social media, and that’s, for me, the bottom line of those videos. No one’s going there to have their minds changed. It’s not designed around that. It’s designed around capturing the algorithm, and I think it’s, you know, bad for democracy and pathetic as well.
[Music]
Warzel: I’m Charlie Warzel, and welcome to Galaxy Brain. Today we’re gonna talk about discourse, and not the shorthand on the internet for viral outrage, right? When we talk about discourse, we’re talking about, like, Cracker Barrel’s logo changing and people being up in arms about it, or the latest political infighting. This version of discourse that we’re gonna talk about in this episode is much more substantial, and I think that the stakes are far higher. Discourse—as we define it here—is, essentially, just our ability to talk to each other, to find things out about the world, to debate those ideas and establish ground truths and reject the things that we don’t like.
It is our collective sense-making process. It’s how we do science; it’s how we develop laws. It’s the backbone of a functional and healthy society, and nowadays when we talk about discourse, we often just refer to it as “The discourse is bad.” But there are all kinds of discourse, right? There’s a healthy, functional discourse where there are elites that are held to account, where we can debate things, where institutions and people act in good faith and are benevolent. And there’s a version of discourse that is kind of hollow, right? Where there are good actors and bad actors, and we kind of just limp along there, even though there are a lot of inequities in the system. And then there’s what my guest calls a “disordered discourse,” where democracy is almost simulated. You have people who get into power, and they wield it by imposing their views on the world and making it so they can never really be held to account, right? This is something that I think a lot about today. And now with the Jeffrey Epstein investigation, right? You have this trove of emails that people are seeing, where you have elites talking behind closed doors and operating with impunity—because they don’t feel like they’re ever gonna be held to account for the amoral or immoral things that they have done.
And when that gets found out in the world, people get really angry, right? But in this disordered discourse, the reason why my guest says it’s simulated is because when people try to push back against that—when people do try to hold leaders to account—nothing functionally happens, or not enough happens. And so you get this incredible frustration. And when people feel like their democratic participation isn’t rewarded, they start to tune out or drop out of the system altogether. And it’s very, very dangerous. My guest today is Eliot Higgins. He’s an investigative journalist who founded the open-source company called Bellingcat.
They produce journalistic investigations using all kinds of publicly available data online. And Eliot is somebody who is a true internet native and really understands—and has gotten into the weeds of—all the different online manipulators, nefarious bad actors. And knows these platforms and systems inside and out. And so he’s the perfect person to talk about this. Not just because he has the experience, but also because he’s developed a framework around disordered discourse. And in it, he has this idea that there’s basically three conditions that allow societies to function, right. You have to be able to establish truth, debate what matters, and hold the powerful to account.
And if you think about those three pillars right now … doesn’t really seem like we’re doing a great job on a lot of those. You know, it is harder than ever to establish truth these days. Debating what matters is happening all the time, but is happening in a very chaotic way, right? We’ve outsourced a lot of these conversations to these tech platforms that are not neutral—that constantly manipulate us, that drive us to be the worst versions of ourselves, that amplify outrage.
We are operating in what the researcher Renée DiResta calls these “bespoke realities.” And so it leads to a discourse that is so disordered that it really threatens democratic collapse. And so, Eliot Higgins is going to walk us through this framework and try to ground us a little bit, describe the temperature of the water that we are all swimming in all the time, and help us try to figure out, if at all, how we can claw it back. So here’s Eliot Higgins.
Eliot, welcome to Galaxy Brain.
Higgins: Thanks for having me on.
Warzel: Yeah, absolutely. I wanted to start with your background, and specifically something that you posted on Twitter back when it was Twitter, where you talked about how the research work that you did that turned into Bellingcat started with—and I’m gonna quote you here—“me arguing with people on the Guardian Middle East Live blog comments; posting way too much on the Something Awful forums. During those arguments in 2011, there were videos shared from Libya, and arguments about their authenticity. That’s when I figured out you could use satellite imagery to figure out where these videos were filmed, stumbling into geolocation.”

The Atlantic