Instagram appealing $400 million fine from Ireland data privacy org over GDPR violations
Jonathan Greig September 6, 2022
Instagram said it will be appealing a $400 million fine issued by Ireland’s data privacy watchdog over accusations that the social media giant violated the General Data Protection Regulation (GDPR) by allowing children as young as 13 to operate business accounts. 

A spokesperson for Meta, Instagram’s parent company, disputed the way the fine was calculated, arguing that it “is not in accordance with the text of the GDPR,” resulting in a fine that is significantly larger than other GDPR-related fines. 

Ireland’s Data Protection Commission (DPC) said it adopted its final decision on Friday and will publish the full details of the decision next week, so it is unclear why Meta believes the fine was incorrectly calculated. 

DPC Deputy Commissioner Graham Doyle told The Record that the €405 million fine was the result of an inquiry that began on September 21, 2020 thanks to a tip from a third party and analysis done by DPC researchers. 

The inquiry centered on two different issues with how Instagram processed data in Ireland. Business accounts run by children between the ages of 13 and 17 were required to publicly display phone numbers and/or email addresses.

Doyle added that at certain times, Instagram had a user registration system in which the accounts of child users were set to “public” by default, thereby making the social media content of child users publicly visible. Users had to set the account to “private” manually through the account privacy settings. 

The Meta spokesperson disputed the inquiry, arguing that it “focused on old settings” the company updated about a year ago. The spokesperson added that the company has released a number of new features to “help keep teens safe and their information private.”

“Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can’t message teens who don’t follow them,” the spokesperson said. “While we’ve engaged fully with the DPC throughout their inquiry, we disagree with how this fine was calculated and intend to appeal it. We’re continuing to carefully review the rest of the decision.”

‘Proactive’ changes

The company added that in July 2021, it began to make all new under-18 accounts in the United Kingdom and European Union private by default. 

Instagram has also rolled out several other changes since then: people can no longer tag or mention teen accounts that don’t follow them, and accounts of children under 16 are defaulted into the ‘Less’ setting of Instagram’s Sensitive Content Control when they join the platform. 

The move was designed to make it more difficult for those accounts to come across “sensitive content or accounts” in the platform’s search, explore, recommendations and suggested account tabs. 

The spokesperson admitted that displaying contact information on business accounts was by design but noted that before 2019, users were informed that the information would be publicly displayed. 

After September 2019, Instagram made that feature optional, and in the following years the company claimed it “proactively” notified teens with business accounts that their contact information was optional and could be removed. 

Now teens have to manually add contact information to their business profile as a way to “add friction to the process,” a Meta spokesperson said. 

The company argued that in recent years, Instagram has made a concerted effort to shield child-run accounts from adults with a variety of measures including banning adults from sending direct messages to teens who don’t follow them. 

Instagram also claimed to have developed technology that allowed them to identify “suspicious adult accounts” and stop them from finding or interacting with child accounts. 

According to the spokesperson, Instagram also sends warnings about conversing with adults to teens who already follow adult accounts. 

The fine would be the second largest issued under the GDPR, after the more than $800 million fine levied against Amazon last year. 

Meta has previously faced two other fines from European data privacy regulators, including a $267 million fine against WhatsApp and a $17 million fine against Facebook.

Jonathan has worked across the globe as a journalist since 2014. Before moving back to New York City, he worked for news outlets in South Africa, Jordan and Cambodia. He previously covered cybersecurity at ZDNet and TechRepublic.