Crowdsourcing morality: How an app allows the Iraqi government to arrest ‘indecent’ influencers
Earlier this year, Iraq’s Ministry of the Interior launched an app called Ballegh to great fanfare. The government in Baghdad billed it as an opportunity for anyone in Iraq to help the government identify social media posts that were “offensive.” The problem is that the government never defined “offensive,” leaving Baghdad free to arrest just about anyone after deeming their content “indecent.”
Digital and human rights activists say Ballegh’s very existence flies in the face of free speech provisions enshrined in Iraq’s post-Saddam Hussein constitution 20 years ago. In a written interview with Click Here, Ayad Radhi, the IT administrator at the Ministry of Interior in charge of the app, says the government is just protecting “family values.”
Ordinary Iraqis, for their part, appear to be embracing the effort. The Interior Ministry told Click Here that it has received more than 144,000 reports on the platform since it was introduced in January. Radhi laid out how it works. His comments have been edited and condensed for clarity.
CLICK HERE: How does it work after someone sends you a report?
AYAD RADHI: When a citizen files a report on "Ballegh," a record is created in the database and marked as "new-case." A team of admins reviews the new cases and decides whether each one contains the bare minimum of information to pass to the forensic department, where it is analyzed for anything that qualifies as evidence. If so, the case is pushed up the chain.
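The workflow Radhi describes can be pictured as a simple case state machine. The sketch below is purely illustrative; the status names and transitions are assumptions inferred from his answer, not the ministry's actual schema or code.

```python
from enum import Enum

class CaseStatus(Enum):
    # "new-case" is the only label Radhi names; the rest are assumed stages
    NEW = "new-case"
    UNDER_REVIEW = "under-review"
    FORENSIC = "forensic-analysis"
    ESCALATED = "escalated"
    CLOSED = "closed"

# Allowed transitions, following the chain Radhi outlines:
# citizen report -> admin review -> forensic analysis -> up the chain
TRANSITIONS = {
    CaseStatus.NEW: {CaseStatus.UNDER_REVIEW},
    CaseStatus.UNDER_REVIEW: {CaseStatus.FORENSIC, CaseStatus.CLOSED},
    CaseStatus.FORENSIC: {CaseStatus.ESCALATED, CaseStatus.CLOSED},
    CaseStatus.ESCALATED: set(),
    CaseStatus.CLOSED: set(),
}

def advance(current: CaseStatus, target: CaseStatus) -> CaseStatus:
    """Move a case to the next stage, rejecting out-of-order jumps."""
    if target not in TRANSITIONS[current]:
        raise ValueError(f"cannot move {current.value} -> {target.value}")
    return target

# A case that survives admin review and forensic analysis:
status = CaseStatus.NEW
status = advance(status, CaseStatus.UNDER_REVIEW)
status = advance(status, CaseStatus.FORENSIC)
```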
CH: And how do you determine what is indecent or immoral?
AR: That is left to the committee — which has an officer, a judge, social researchers, community specialists — to decide. So it is never a single person who makes the decision. The committee is a central component in the chain; it processes the reports daily and moves them up to the next level if they contain incriminating evidence.
CH: You’re getting tens of thousands of reports. How do you decide the size of the team fielding these reports?
AR: The team is elastic. Staff members are added and removed on demand, in accordance with the number of unique daily reports. We have custom-built algorithms to aggregate similar reports. When some incident or video goes viral, we will have thousands of reports pointing to the same content, so we have a couple of algorithms that group similar reports so they can be processed as one.
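Radhi doesn't describe how the grouping algorithms work. One common approach to this kind of deduplication is to normalize the reported link (stripping tracking parameters and fragments) so that different shares of the same post collapse into one case. The sketch below is a minimal illustration of that idea, under those assumptions; it is not the ministry's actual method.

```python
from collections import defaultdict
from urllib.parse import urlparse, urlunparse

def normalize_url(url: str) -> str:
    """Drop query strings and fragments so shares of the same post match."""
    p = urlparse(url.lower())
    return urlunparse((p.scheme, p.netloc, p.path.rstrip("/"), "", "", ""))

def group_reports(reports: list[dict]) -> dict[str, list[dict]]:
    """Bucket reports that point at the same content into one case."""
    groups: dict[str, list[dict]] = defaultdict(list)
    for report in reports:
        groups[normalize_url(report["url"])].append(report)
    return dict(groups)

# Three reports, two of which point at the same (hypothetical) video:
reports = [
    {"id": 1, "url": "https://example.com/video/42?ref=share"},
    {"id": 2, "url": "https://example.com/video/42"},
    {"id": 3, "url": "https://example.com/video/99"},
]
grouped = group_reports(reports)
# grouped holds two cases: video/42 (reported twice) and video/99
```

Real systems reviewing "thousands of reports pointing to the same content" would likely add fuzzier matching (content hashes, near-duplicate detection), but the URL-normalization step is usually the first pass.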
CH: Iraq is the latest example of a country in the Middle East using technology to report social media activity. For example, it reminds us of Kollana Amin in Saudi Arabia. Do you think it is similar?
AR: That is an interesting observation. To be honest with you, we developed it based on our needs, so we didn't compare it with any other solutions. We just made a list of our needed features and went into building mode.
CH: Some people say Ballegh is allowing people to settle personal vendettas. How do you respond to that?
AR: Well, people can say many things, but what matters is the real results. All the people who were arrested had hundreds, in some cases thousands, of complaints against them over their obscene or adult videos. Let me put it this way: the cases were clear-cut from a legal perspective. That is why not a single case was thrown out in a court of law.
CH: Are you seeing any demographic trends among those using Ballegh to report social media users?
AR: We don't collect any information about the people who do the reporting, since it’s anonymous. However, we are seeing one major trend: people care deeply about family ethics and values. The top reported content is usually a video that violates the core principles of the Iraqi family. Let me put it this way: people are fighting tooth and nail for their families.
In terms of the people who are getting reported, we see 20-to-30-year-olds with no college degree and no formal employment.
CH: Do you have a Ballegh 2.0 in the works? Are there features you’d add, which you wish you had?
AR: Ballegh is being developed continuously; the admin panel, for example, is updated on a regular basis. We receive lots of feature requests from our admin teams that help them do their work more efficiently, and we are also optimizing the similarity algorithms regularly. So the way we see it, there is no 2.0 version; we are in rolling-updates mode.
CH: We were able to access Ballegh in the United States. Is it accessible outside of Iraq?
AR: It was accessible, but we got flooded with DDoS attacks, so we limited it to Iraq only for the time being.
CH: If you could do one thing to improve the app, what would it be?
AR: We are very interested in what OpenAI’s GPT-4 has to offer, so we are looking into ways of integrating it with the admin panel to improve the similarity and pre-filtration process. The more we think about it, the more we see it as a must-have feature.
We also added a second type of reporting, “corruption by a member of the security forces.” We are asking the public to report any corruption or extortion cases. We believe in having well-disciplined security forces and are relying on our people to keep us informed.
Dina Temple-Raston is the host and executive producer of the Click Here podcast as well as a senior correspondent at Recorded Future News. She previously served on NPR’s Investigations team focusing on breaking news stories and national security, technology, and social justice and hosted and created the award-winning Audible Podcast “What Were You Thinking.”
Sean Powers is a senior producer for the Click Here podcast. He came to the Recorded Future News from the Scripps Washington Bureau, where he was the lead producer of "Verified," an investigative podcast. Previously, he was in charge of podcasting at Georgia Public Broadcasting in Atlanta, where he helped launch and produced about a dozen shows.