TikTok Fined £12.7 Million For Violating Children's Data Protection

By Olivia William
Writer, Information Security Buzz | Apr 04, 2023 06:15 am PST

TikTok was fined £12.7 million (~$15.7M) for violating the UK's children's data protection law, in particular its child protection requirements. The Information Commissioner’s Office (ICO) said today that the video-sharing platform “did not do what is expected” to check who was using the platform and remove underage users.

The ICO estimates that 1.4 million underage UK users were on TikTok between May 2018 and July 2020, despite its terms of service requiring users to be 13 or older. Because the UK's children's data protection rules set 13 as the age at which children can consent to the processing of their own data, TikTok would have needed parental consent to lawfully process these minors' data, which it did not obtain.

“We punished TikTok for providing services to UK children under 13 and handling their personal information without their knowledge or their parents' or legal guardians' permission. We expect TikTok to continue its efforts to identify and remove underage users from its platform,” an ICO spokeswoman said.

The ICO also determined that TikTok failed to give users clear, easy-to-understand information about how their data is collected, used, and shared, violating the UK General Data Protection Regulation (UK GDPR).

“Without that information, users of the site, children in particular, were unlikely to be able to decide whether and how to participate in it in an informed manner,” the ICO said in a news release announcing the penalty for mishandling children's data. UK Information Commissioner John Edwards said that laws protect minors online as well as offline, and TikTok violated them. As a result, TikTok collected and used personal data from an estimated one million under-13s, whose data may have been used to track and profile them, potentially serving damaging, unsuitable content on their very next scroll.

TikTok should have known better, Edwards said: it did not check who was using its platform or remove underage children, and it should have done better. The £12.7m fine reflects those failures. TikTok was contacted about the ICO's enforcement action and said it is reviewing the decision.

A TikTok spokesperson said the platform is intended for users aged 13 and over, adding that the company invests heavily to keep under-13s off the platform and that its 40,000-strong safety team works around the clock to keep the community safe. While TikTok disagrees with the ICO's decision, which covers May 2018 to July 2020, the company said it is glad that today's fine is less than half the amount proposed last year, and that it will review the decision and consider next steps.

TikTok claims it has since addressed the failings it is being penalized for today. It still relies on an age gate, which underage users can get around by lying about their age when creating an account, but it says it pairs this with strengthened procedures and training for its safety moderation team to watch for signals that an account may be used by a child under 13, so such accounts can be flagged and reviewed. It also says it responds quickly to parents' requests to remove underage accounts and uses keywords and in-app signals to identify probable underage accounts.

TikTok reported removing almost 17 million suspected underage accounts globally in the last three months of 2022, although this figure is not broken down by country. It also offers a Family Pairing feature to help parents monitor their children's usage.

TikTok Violates The UK GDPR On Children's Data Protection

The social media platform was found to have violated the UK GDPR's lawfulness, fairness, and transparency requirements over a roughly two-year period, yet the £12.7 million penalty is far below the theoretical maximum of 4% of global annual turnover.

In September, the ICO published a provisional finding suggesting it might fine the company up to £27M ($29M) for various potential breaches. The penalty was reduced significantly after the regulator, in light of TikTok's submissions, decided not to pursue a provisional finding relating to the improper use of special category data.

There is a higher standard for the lawful use of special category data, which includes information about racial or ethnic origin, political beliefs, sexual orientation, religion, and health. The ICO accused TikTok of unlawfully processing this data last year, but the company was able to allay the regulator's concerns.

Asked why the special category data finding was dropped, a spokeswoman for the regulator told TechCrunch:

TikTok's representations persuaded us to drop the provisional finding on the misuse of special category data. How social media companies use special category data remains important to the ICO, but we need to be strategic with our resources, and the Commissioner exercised his discretion not to pursue that provisional finding. Because this potential infringement is not included in the final penalty, the provisional fine was reduced to £12.7 million. The fine is based on our Regulatory Action Policy.

The ICO's silence over systematic breaches by the behavioral advertising sector may hinder its capacity to pursue individual platforms that rely on data-driven tracking, profiling, and ad microtargeting to monetize a “free” service.

The UK watchdog has made children's data protection a priority. Under pressure from advocacy organizations and UK legislators, it established an Age Appropriate Design Code in recent years, tied to GDPR compliance (and consequently to the possibility of fines for those who disregard its recommended standards). The children's privacy and safety code came into force in September 2021. The ICO has begun investigating under it, but there has yet to be a wave of enforcement.

Since the UK is no longer an EU member, the ICO enforces the GDPR only in the UK. TikTok remains under investigation in the EU over how it handles children's data.

The ICO's enforcement against TikTok covers UK users only, which accounts for some of the difference in size between penalties. Last year, Instagram was fined €405M in the EU over its handling of children's data, a fine that reflected cross-border data processing across the bloc of 27 member states. Ireland's Data Protection Commission (DPC) opened an investigation into TikTok's handling of children's data in September 2021.

Conclusion

Social media company TikTok was fined £12.7 million (~$15.7M) by the UK's Information Commissioner's Office (ICO) for repeatedly breaking the nation's children's data protection laws. TikTok failed to obtain parental permission for users under 13 and did not conduct sufficient checks to detect and stop underage users from accessing the app. There are rules to ensure that children are just as safe online as offline, and TikTok broke them, according to UK Information Commissioner John Edwards. As a result of these inadequate checks, the privacy watchdog says, TikTok collected and used the personal information of an estimated one million children under 13.

That means their information may have been used to track them and build profiles of them, potentially delivering harmful content on the very next scroll, Edwards said. The ICO also noted that TikTok failed to adequately inform platform users about how their personal data is collected, used, and shared, and did not ensure that UK users' data was handled “lawfully, fairly, and in a transparent manner.” It is worth noting that the ICO originally intended to fine TikTok £27 million ($33 million), partly for using special category data without permission, but ultimately chose not to pursue that provisional finding.
