
Bellingcat Founder Eliot Higgins on Finding Truth in a World of Disinformation

In 2013, Eliot Higgins was an unemployed finance and administration worker blogging about the Syrian civil war from his home in Leicester, about 100 miles north of London. Since then, Higgins has turned his hobby into a full-fledged investigative journalism operation, with an office in the Netherlands and 18 full-time employees.

His website, Bellingcat, has made a name for itself by uncovering some of the most important stories on the Syrian conflict and other international disputes, including the downing of Malaysia Airlines Flight 17 in 2014 and the poisoning of Sergei and Yulia Skripal in 2018.

Higgins and his colleagues have been called pioneers of modern open-source investigation, which uses publicly available sources—such as YouTube videos, social media accounts, and news items—to verify information around major events and uncover new details that would typically go unnoticed. Part of Higgins’ mission is to expand the use of these techniques globally, and inspire people to get involved with open-source investigations. Bellingcat has contributors in more than 20 countries.

Higgins talked to The Record via Zoom from his home in Leicester about disinformation, the upcoming presidential election, and the rapid spread of conspiracy theories. The conversation below has been lightly edited for length and clarity.

The Record: When you started Bellingcat in 2014, did you have any idea that misinformation and disinformation would be such a big problem today?

Eliot Higgins: No. When I started, I really had no concept of how bad it would get, especially around the 2016 election and how it's progressed since then. Some of the driving factors were definitely there—when you talk about misinformation there's often a focus on Russia and what Russia is up to, but I think the media culture in the U.S., for example, has also led to what we're seeing now in 2020.

I was just watching a clip today where a journalist from OANN [the One America News Network] was asking Donald Trump the most incredibly loaded question, basically giving him the answer, and I said even Russia Today journalists wouldn't be this blatant about what is basically propaganda. So what you've really seen develop over the last several years, and this is something that's been going on for a long time but has really come into focus now, is this alternative right-wing media ecosystem in the U.S., which is probably more influential in the elections than any Russian disinformation campaign could be. But back in 2014 I was just trying to make a place where people could come together to do open-source investigations.

TR: What about the more established news outlets? Do you think they're getting better at verifying information and presenting the truth?

EH: To an extent, but there are mainstream media organizations that aren't actually too interested in verifying stuff anyway. There's the MailOnline, a very big English-language news website that publishes very thinly sourced articles, all of which are basically garbage, but they just want to get as many clicks on their website as possible, so they don't really care about being accurate. One example: there was a video shared online claiming to show an ISIS sex slave market, and the Mail did a piece on it based on one source, which was a Facebook group, basically. ISIS isn't going to sue the Daily Mail for misrepresenting what ISIS is doing. ISIS isn't going to call their lawyers and say: "Hey, can you talk to the Mail about this article they published about us?" So you do see even mainstream media sites using badly sourced stories that are completely false, because it gets them the clicks.

TR: Do you think that ultimately what Russia did is take down the value of facts in America?

EH: No, I would say probably the right-wing media ecosystem and the cultural changes that started decades ago have come to this point where U.S. media in particular has been split into two parts. People have a whole range of options now in the right-wing media ecosystem—Breitbart, Fox News, and all these right-wing websites where they have a choice of a wide range of media to consume without ever having to consume the other side. Non-Republican media is portrayed as basically the enemy, so why would they consume it when it is portrayed as deceptive and lying and having an agenda against the person they support? And I think this is something that is far more ingrained in U.S. culture than it is in other parts of the world.

There is still a slight element of that in Europe. In the U.K., the press in particular is very split along party lines at the moment, with the Mail very much behind the Conservative Party and Boris Johnson. So it exists in other places, but it's become hyper-mobilized in a way in the U.S. And whenever we come to discussions of disinformation and misinformation, you often find this focus on Russian interference in the 2016 election. I'm sure it had some impact when we had stuff like the WikiLeaks emails being published. But I don't think a bunch of Russian troll factories putting stuff on Facebook is what made the difference between Trump and Clinton back in 2016. What made the difference was how the media coverage of both sides came about.


TR: Do you think Russia and other operators are getting better at disinformation?

EH: I wouldn't say they're getting better at it—they're just getting better at producing more of it. They have had issues because Facebook and other social media companies have been cracking down on this behavior. That's impacted how they can operate. But we see disinformation campaigns and influence campaigns being run in a similar fashion in other parts of the world: memes produced on a whole range of different issues and used to target certain groups, and not just in big countries like China but in smaller nations. We've also seen businessmen, political figures, and activists using these tactics for their own purposes. We've seen campaigns originating from all kinds of fringe groups because it's quite easy to get a botnet to post a load of tweets about stuff. The actual impact of that is usually very little, because people don't follow these pop-up networks—why would you? So they want to get something to trend, and very rarely does it actually happen because Twitter is getting better at picking up on these campaigns, but it's still happening. There are still various influence networks on Facebook and Twitter being identified and shut down all the time.

TR: You've been very successful at tracking a lot of these operators and shedding light on them. Do you worry about physical threats to yourself and the organization because of your work on the GRU and other operators?

EH: To a certain extent, yeah. It's gotten to the point now where I've been visited by the police to talk about my safety, and when I do events now they often have to put on extra security. It's not just because of threats from Russia but also these alternative media ecosystems that really hate Bellingcat, because they believe that we're part of the CIA or MI5 or whichever three-letter agency they prefer on that day. And those are the kinds of people who tend to be, in my mind, more worrisome. If you're dealing with the GRU or the FSB from Russia, then you expect they won't do certain things. But some crazy person off the internet who's decided that you're part of the Illuminati could do anything. We also think a lot about cybersecurity, and we take steps to be as secure as we can be.

I'm also targeted frequently by disinformation from Russian sources and this alternative media ecosystem that exists around the topics we investigate. Russia Today has published a vast number of articles about me—they take every single opportunity they can to find some criticism of me. When I tweet, I know that if I say something controversial there's a good chance there's going to be an article about it, and that makes me police what I'm posting online quite a bit.

"I wouldn't say [Russia] is getting better at [disinformation]—they're just getting better at producing more of it."Eliot Higgins, founder of Bellingcat

TR: What are some of the biggest challenges that you face with open source investigations?

EH: The challenge is finding the stuff in the first place—just having the time and the resources to look into things. We could investigate a million different topics, but we don't have millions to do it with, so we have to be very narrow in what we're trying to do. We're trying to spread the use of open-source investigation by leading by example—training other people to do it, building a community of volunteers. More recently we've been looking at working on areas around justice and accountability. Using our experience with Syria, we started investigating Saudi airstrikes in Yemen and developed a process of investigation with the Global Legal Action Network, which is a network of lawyers, to basically produce evidence from open sources that could then be submitted in court. We've submitted that for a U.K. government inquiry into arms exports into Yemen. Basically the U.K. government moved the goalposts of what was acceptable until the evidence was submitted, but we've submitted to other cases as well. So in a way, the challenges are not so much in the work but in how the work is being applied and what we can do to support that, because there's a lot of interest now, especially in the justice and accountability element of open-source investigation.

TR: How do you explain the appeal of rapidly growing conspiracy theories like QAnon?

EH: Part of it is because the internet is very good at making you find things that you like and communities that believe the same things as you. And if you're part of one of those communities and they become too extreme for you, or you become too extreme for them, you find a community that fits what you like—you're filtered toward whatever is the most interesting thing for you. The problem is, some people are inclined to more extreme and strange beliefs. It can be people who believe that bleach is a medication that can cure autism, or people who believe that all the chemical weapons attacks in Syria are fake. And no matter how extreme your beliefs are, you will always find someone who is similar to you. You'll find those communities, and those communities come together and they'll have their own bloggers, their own personalities, and create these separate media ecosystems.

"It's gotten to the point now where I’ve been visited by the police to talk about my safety, and often doing events now they have to put on extra security."Eliot Higgins, founder of Bellingcat

TR: Deep fakes have become a much bigger issue since Bellingcat was founded. Do you think they’ll be easy to debunk, or are you concerned that they’ll cause serious damage?

EH: I think there are two elements to this. One is how a deep fake is shared on social media and how quickly it propagates. The second is the evidence required to debunk it. If we saw a video of Trump saying that he loved Joe Biden, or something like that, it would be a clip from a longer piece of video, which we would have to find. You would look at the way his body was moving to see if you could match that to any particular piece of footage, because it's usually the face that's altered. It could take hours and hours and hours—we can do that at the moment, but if that video has then been tweeted 10 million times… Once we figure out it's fake, it's not going to really help too much if the message is already out there and it's already had an impact.

There are a lot of companies now looking at how to identify deep fakes faster. I think the main problem we've got at the moment with deep fakes is that these A.I.-generated faces keep turning up on fake social media profiles. Because those are often slightly wrong in various ways, like backgrounds that look weird, teeth in a slightly off position, or light reflecting off the eyeballs that comes from two different sources, you can actually identify them, and that's a clue that an account is fake. In fact, Jacob Wohl, the right-wing idiot who is very popular, used a fake social media account with one of these images in his last little press conference, making a wild claim about some politician or another. But because of the problems with the deep fake, we could immediately say this was a fake account. We followed it back and found it was his account for another fake scam he did—it was basically just the same Twitter account that he kept renaming and using for a series of scams. So in a way, anything that allows us to identify that something is wrong will make us look at it more and dig into it. We're looking for stuff that's untrue, stuff that's wrong, stuff that's interesting—and the most interesting thing is someone lying and getting caught.

TR: You have a book coming out—what made you decide to write it, and what will the focus be?

EH: Well, I've been asked to write books over the years about open-source investigation and things like that. In 2019 it seemed like a lot had happened—MH17, the Skripal poisonings—big examples of how open-source investigations were being used for justice and accountability. And we were doing more work at Bellingcat that wasn't front-facing—things in the background that aren't really well publicized. So I wanted to write about that, and explain my story of where I came from—starting this as a hobby on my sofa at my home in Leicester and growing into this hot thing that's had a major impact: exposing Russian spies, showing who shot down MH17, and being used by the International Criminal Court and a whole range of organizations. So, for me it was really just a chance to lay this all out and explain it and hopefully do it in an entertaining fashion... With the book I hope to inspire more people to get involved in this work.


Adam Janofsky is the founding editor-in-chief of The Record from Recorded Future News. He previously was the cybersecurity and privacy reporter for Protocol, and prior to that covered cybersecurity, AI, and other emerging technology for The Wall Street Journal.