ICO FINES TIKTOK £12.7 MILLION FOR UK GDPR BREACHES

The UK Information Commissioner’s Office (ICO) has fined TikTok Information Technologies UK Limited and TikTok Inc (together, TikTok) £12.7 million for breaches of data protection law, including failing to use children’s personal data lawfully.

TikTok’s breaches 

The ICO found that TikTok breached the UK General Data Protection Regulation (UK GDPR) between May 2018 and July 2020 by:

  • Providing its services to UK children under the age of 13 and processing their personal data without consent from their parents or carers.  
  • Failing to provide people using the platform with proper information, in a way that is easy to understand, about how their data is collected, used and shared.  Without that information, users of the platform, in particular children, were unlikely to be able to make informed choices about whether and how to engage with it.
  • Failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner.

The ICO estimates that TikTok allowed up to 1.4 million UK children under 13 to use its platform in 2020, despite TikTok’s own rules prohibiting under-13s from creating an account, and despite UK data protection law prohibiting organisations from offering information society services to children under 13 without consent from their parents or carers.

According to the ICO investigation, senior TikTok employees raised concerns internally about underage children using the platform, but TikTok “did not do enough” to check who was using its platform and remove underage users.

The fine was originally set at £27 million (see ICO news from September 2022), but following representations from TikTok the ICO decided not to pursue a provisional finding related to unlawful use of special category data (such as ethnicity, political and religious beliefs and sexuality), reducing the final amount of the fine to £12.7 million.

Nevertheless, this is one of the largest fines handed down by the ICO to date, coming in third behind the 2020 fines issued to British Airways (£20m) and Marriott Hotels (£18.4m).

What does this mean for TikTok?

Perhaps unsurprisingly, TikTok doesn’t agree with the ICO’s decision, although it is “pleased that the fine announced… has been reduced to under half the amount proposed last year”.  TikTok is considering its next steps and may appeal against the amount of the fine.

However, given the £64 billion revenue reported to have been made by TikTok’s parent company ByteDance in 2022, the fine only amounts to around 0.02% of the group’s 2022 global annual revenue.  When viewed against the maximum fine that the ICO can issue (the higher of £17.5 million or 4% of total annual worldwide turnover in the preceding financial year – so around £2.56 billion based on the TikTok group’s 2022 revenue), the fine doesn’t look so serious. 
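
For anyone who wants to check that arithmetic, here is a minimal Python sketch using the figures reported above (the statutory calculation is, of course, a legal exercise rather than a one-liner):

    # A minimal sketch of the fine arithmetic described above, using the
    # figures reported in this article.

    FIXED_CAP_GBP = 17_500_000   # £17.5m fixed alternative under the UK GDPR
    TURNOVER_RATE = 0.04         # 4% of total annual worldwide turnover

    def max_ico_fine(turnover_gbp: float) -> float:
        """Higher of £17.5m or 4% of preceding-year worldwide turnover."""
        return max(FIXED_CAP_GBP, TURNOVER_RATE * turnover_gbp)

    bytedance_2022_revenue_gbp = 64e9   # £64bn, as reported above
    fine_gbp = 12.7e6                   # the £12.7m fine actually issued

    print(f"Theoretical maximum: £{max_ico_fine(bytedance_2022_revenue_gbp):,.0f}")
    # Theoretical maximum: £2,560,000,000
    print(f"Fine as a share of 2022 revenue: {fine_gbp / bytedance_2022_revenue_gbp:.2%}")
    # Fine as a share of 2022 revenue: 0.02%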

And this isn’t the first enforcement action TikTok has faced:

  • In 2019, the US Federal Trade Commission fined TikTok $5.7m (a record at the time) for improper data collection from children under 13.
  • Several countries and institutions, including the UK, USA, EU, New Zealand, Belgium, Denmark, Canada and Taiwan, have recently banned TikTok from government devices over security concerns about Chinese government access to data collected by the app.
  • The US government recently told ByteDance to divest itself of TikTok, creating a clear break from China, or face a potential country-wide ban.
  • In 2020, India imposed a nationwide ban on TikTok (along with 58 other Chinese-owned apps), believing the apps to be “prejudicial to sovereignty and integrity of India, defence of India, security of state and public order”.  
  • Afghanistan has also banned TikTok to “prevent the younger generation from being misled”.

However, none of these actions seem to have diminished TikTok’s fortunes, with no signs of waning enthusiasm from either its 1bn users or the many companies that have flocked to TikTok for its unique marketing and e-commerce opportunities and access to young consumers.

Most of TikTok’s young users won’t know or care about the ICO’s fine (or any other enforcement action).  And any bans on content grounds could arguably make TikTok even more of a forbidden fruit.  

As long as TikTok remains accessible to and popular with young people, and advertisers can take advantage of TikTok’s innovative content algorithms, user profiling and methods of communicating products and brands to users, it’s unlikely that TikTok’s advertising revenue is going to take a hit.

What does this mean for your business?

Since its investigation of TikTok, the ICO has issued its “Children’s code” (formally the “Age appropriate design code”), a statutory code of practice aimed at helping to protect children in the digital world.

It applies to online services such as apps, online games and web and social media sites likely to be accessed by children, stipulating stricter rules for platforms to follow when handling children’s personal data. 

The ICO clearly intends to focus its regulatory powers on protecting children online.  In its Annual Action Plan: October 2022 – October 2023, the ICO says:

“For the period until October 2023, we will focus our investigation and project work on the following issues:

Children’s privacy – we will continue to enforce our Children’s code and influence industry to ensure children benefit from an age-appropriate online experience.  We will:

  • press for further changes by social media platforms, video and music streaming sites and gaming platforms to correctly assess children’s ages and conform with the Children’s code guidelines about profiling children and sharing their data.  We will continue to push for improved transparency and use of privacy notices children can understand.  We will also consider changes to the code required by legislative reform on data protection and to promote closer policy alignment with the Online Safety Bill; and
  • continue our investigations where organisations are not conforming with the code and take appropriate enforcement action.  We will share examples of good practice from our direct engagement with organisations to encourage wider positive change.  We will work with Ofcom through the DRCF to promote joined up and effective regulation to protect the privacy and safety of children online.”

If your business provides an online service likely to be accessed by children, it will need to make itself familiar with the content and requirements of the Children’s code – and comply with it.  The ICO’s Children’s Code Hub is a good place to start.

To avoid the breaches that led to TikTok’s fine, your business will need to ensure that it:

  • Has adequate age verification processes in place (a simple illustrative sketch of how these checks might combine follows this list).
  • Obtains valid consent from parents/carers of users aged under 13.
  • Has robust record-keeping processes in place to enable it to demonstrate the above.
  • Implements systems that can identify and remove underage users for whom valid parental/carer consent hasn’t been obtained.
  • Provides proper information to children about how their data is collected, used and shared, and ensure this is easy for children to understand so that they can make informed choices about whether and how to engage with the service.
  • Processes children’s personal data lawfully, fairly and in a transparent manner – which means complying with the specific requirements of the Children’s code on top of the more generally applicable requirements of the UK GDPR.
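
By way of illustration only (every name in the sketch below is hypothetical, and real age assurance and consent management are far more involved), here is how those checks might combine into a single gate, applied before any of a child’s personal data is processed:

    # Purely illustrative, hypothetical Python; not legal advice or a
    # compliance tool.  It combines the checks listed above into one
    # gate applied before processing a user's personal data.

    from dataclasses import dataclass
    from datetime import date

    UK_DIGITAL_CONSENT_AGE = 13   # under-13s need parental/carer consent in the UK

    @dataclass
    class User:
        user_id: str
        date_of_birth: date               # ideally verified, not just self-declared
        age_verified: bool                # outcome of your age-assurance process
        parental_consent_on_record: bool  # auditable consent record for under-13s

    def age_on(dob: date, today: date) -> int:
        """Whole years between dob and today."""
        return today.year - dob.year - ((today.month, today.day) < (dob.month, dob.day))

    def may_process_personal_data(user: User, today: date) -> bool:
        """Process only if age is assured and, for under-13s, consent is recorded."""
        if not user.age_verified:
            return False   # no reliable age signal: treat as potentially underage
        if age_on(user.date_of_birth, today) < UK_DIGITAL_CONSENT_AGE:
            return user.parental_consent_on_record
        return True

The point of the sketch is auditability: each branch corresponds to something the ICO expected TikTok to be able to demonstrate, which is why robust record-keeping appears in the list above.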

The ICO’s penalising of TikTok, the requirements of its Children’s code and the regulatory targets stated in its Action Plan indicate that the ICO intends to take a firm stance on the protection of children’s personal data online.

As TikTok’s breaches took place before the Children’s code was implemented, it’s possible that the ICO will impose more substantial fines on organisations for similar breaches committed after the code came into force.

In addition to the ICO’s regulatory activity, the Online Safety Bill, which is currently making its way through Parliament, will impose further requirements on providers of online services, including provisions aimed at making social media companies legally responsible for keeping children and young people safe online.

What does this mean for young TikTok users? 

TikTok says it has changed its practices since the ICO’s investigation and now uses more signals than just a user’s self-declared age to determine how old they are, including training its moderators to identify underage accounts and providing tools for parents to request the deletion of their underage children’s accounts.

The ICO acknowledges such improvements in its blog about the effects of its Children’s code:

“As you’d expect it’s already having an impact on these services.  Facebook, Google, Instagram, TikTok and others have all made significant changes to their child privacy and safety measures recently.” 

TikTok’s response to the ICO’s consultation on the Children’s code affirms its commitment to creating a safe online environment and points to TikTok’s “robust array of industry-leading safety features designed to protect our users from misuse”.

A significant number of UK parents will probably remain unconvinced by TikTok’s statements, intentions and delivery regarding children’s use of the platform: parents whose underage children have still managed to download the app, using their relatively superior tech-savviness to get around TikTok’s controls (and their parents’ prohibitions and content/screen-time restrictions), and for whom the parental controls remain a mystery.

Parents may feel there’s nothing magic about the number 13: children don’t suddenly become immune to the negative influences of content they are exposed to on TikTok, or able to resist the addictive, time-sinking allure of TikTok’s winning formula, on the day they turn 13.  It was the UK government, not parents, that decided to set the age below which parental consent is required at 13 rather than 16 – the lowest age permitted under the EU GDPR.

Parents may also struggle to have faith in the effectiveness of TikTok’s “robust array of industry-leading safety features” when their children report watching videos from the likes of Andrew Tate and others promulgating homophobic or misogynistic messages and misinformation.

Add to that the emotional pressure on parents to let their children have TikTok so that they don’t become social pariahs among their friends and peers, or get bullied.  Weighing up the conflicting potential mental harms caused by their children having or not having TikTok can be fraught and exhausting.

Neither the ICO fine nor TikTok’s safety features seems to address the broader risks of TikTok’s data processing, which apply to all users: the app hoovers up data including geolocation, age, content engagement, account follows and topics explored to create valuable and detailed marketing profiles of users.  The ICO’s own report of the TikTok fine and third-party reports have all focused on the underage access and consent elements of the breaches, and as the ICO hasn’t yet published its Enforcement Notice, it’s not clear what processing by TikTok amounted to “failing to ensure that the personal data belonging to its UK users was processed lawfully, fairly and in a transparent manner”.  However, we might speculate that the profiling, direct marketing and international transfers carried out by the platform featured in the ICO’s investigation.

We also don’t yet know the detail of the “provisional finding related to unlawful use of special category data” that the ICO dropped, leading to the reduced fine: what special category data was TikTok using, does TikTok process special category data at all, and if it does, on what lawful basis?

Parents’ main concerns about TikTok seem to be (i) the content children are viewing, (ii) the amount of time that children spend looking at it and (iii) the collection and use of children’s data (but not necessarily in that order).  The recent enforcement actions against TikTok seem unlikely to allay these concerns.

Yet children have become strident defenders and apologists for TikTok.  Because they love it.  Because it was designed to be addictive, fun and easy, and to keep them viewing for as long and as frequently as possible.  And they make some persuasive points: TikTok enables young people to communicate globally about issues that affect them, express themselves and ‘be’ themselves, find a community and ‘safe space’ away from what can be oppressive or abusive school and home environments, be creative, get their performing, musical and artistic skills noticed by the right people and, for successful content producers, earn money.  TikTok made a similar point in its response to the Children’s code consultation: “Therefore we would encourage the ICO to consider a more proportionate approach that will allow information society service providers to balance the need to respect privacy on the one hand, while upholding obligations under the UNCRC to guarantee the rights of the child to express and explore freely”.

Like it or not, parents are likely to have to deal with their children using TikTok for a while yet, and their children’s TikTok use is likely to be a formative experience that shapes this generation’s knowledge, understanding and appreciation of the world and of themselves, and influences their life paths.

Please get in contact with us if you’re unsure whether your business complies with the requirements of the Children’s code and UK GDPR regarding child users of your service/app/platform.  We can help you:

  • Review your technical processes and data processing practices to identify any compliance risks.
  • Suggest ways of making those processes and practices compliant.
  • Produce/update your privacy notices and consents so that they meet the relevant transparency requirements and can be understood by your users.  
  • Determine the appropriate legal bases for your processing.
  • Produce/update your internal documentation to help you comply with – and demonstrate your compliance with – the Children’s code and UK GDPR, such as data protection impact assessments, processing records, data security policies and appropriate policy documents for processing special categories of personal data.
