Instagram Fined €405 Million for Mishandling Teens’ Data

Ireland’s Data Protection Commission (DPC) has issued a €405 million fine to Meta, the parent company of Instagram, Facebook and WhatsApp, over the way it handled children’s data.

The penalty follows an investigation into an Instagram setting that allowed teenagers to set up business accounts without being warned that their contact details would become instantly visible to anyone on the web.

‘Public’ by default

The default setting made teens’ phone numbers and email addresses publicly visible. To comply with child safety laws, Instagram should have set those accounts to private by default.

“We adopted our final decision last Friday and it does contain a fine of €405m,” a DPC spokesperson said. “Full details of the decision will be published next week.”

Meta fired back, saying: “This inquiry focused on old settings that we updated over a year ago, and we’ve since released many new features to help keep teens safe and their information private.”

“Anyone under 18 automatically has their account set to private when they join Instagram, so only people they know can see what they post, and adults can’t message teens who don’t follow them,” a spokesperson for Instagram’s parent company said. “While we’ve engaged fully with the DPC throughout their inquiry, we disagree with how this fine was calculated and intend to appeal it. We’re continuing to carefully review the rest of the decision.”

Full-blown war between Meta and regulators

Since the European Union’s GDPR was introduced in 2018, Meta’s relationship with the Ireland-based watchdog has been rocky, to say the least.

In September 2021, for example, the DPC slapped Meta with a €225 million fine for failing to clarify how certain policy changes within WhatsApp would affect data sharing with the parent company.

And in March this year, Meta received yet another slap on the wrist, to the tune of €17 million, following an inquiry into a multitude of data breaches.

Last year, Meta abandoned plans to develop a children-only version of Instagram amid discussions about the impact such an app could have on children’s mental health. At the time, a whistleblower claimed Facebook’s own research had shown how Instagram could especially affect girls on issues such as body image and self-esteem.

In a filing to the US Securities and Exchange Commission in February, Meta said it “will likely” pull Facebook and Instagram from Europe altogether if it is not allowed to transfer, store and process Europeans’ data on servers in the United States.