IRISH DATA PROTECTION COMMISSION FINES TIKTOK €345M

Findings from the regulator against TikTok provide some important takeaways for any online business

On 15 September 2023 the Irish Data Protection Commission (DPC) announced that it was fining TikTok Technology Limited €345m for various breaches of the GDPR in relation to the TikTok platform’s processing of children’s data between 31 July and 31 December 2020.  The DPC also issued a reprimand and an order for TikTok to bring its processing into compliance within three months.

The DPC found that:

  • The profile settings for child user accounts were set to public by default, meaning anyone (on or off the platform, including non-TikTok users) could view content posted by child users, in breach of:
    • the data minimisation principle in Article 5(1)(c) (Finding 1)
    • the requirement to implement data protection by design and by default in Article 25(1) and (2) (Finding 1)
    • the requirement to implement appropriate measures to comply with the GDPR in Article 24(1) (Finding 2)
  • The ‘Family Pairing’ setting allowed non-child users (who could not be verified as a parent or guardian) to pair their account to a child user’s account and enable Direct Messages for child users aged 16+, posing severe possible risks to child users, in breach of:
    • the data security principle in Article 5(1)(f) (Finding 3)
    • the requirement to implement data protection by design and by default in Article 25(1) (Finding 3)
  • The public-by-default profile settings for child users posed severe possible risks to children under 13 who gained access to the platform, in breach of the requirement to implement appropriate measures to comply with the GDPR in Article 24(1) (Finding 4)
  • TikTok failed to provide adequate information to child users about who can see their content, and specifically that an indefinite audience, including non-registered users, would be able to view their personal data, in breach of the rules on transparency and provision of information in Articles 12(1) and 13(1)(e) (Finding 5)
  • TikTok used ‘dark patterns’ by nudging users towards choosing privacy-intrusive options during the registration process and when posting videos, in breach of the principle of lawfulness, fairness and transparency in Article 5(1)(a) (Finding 6)

TikTok has already launched legal challenges and been granted permission to challenge the fine (see TikTok challenges €345m DPC fine for violations of child privacy rules – The Irish Times, TikTok fights back over €345m DPC privacy fine – The Irish Times and TikTok can pursue challenge to €345m fine by Irish data commission over children’s privacy – The Irish Times), so no doubt we’ll be hearing more on this in due course.

TikTok points out that the DPC decision is based on features and settings that were in place three years ago, and which TikTok had already changed before the investigation began, such as setting all accounts of users under 16 to private by default (with no option to make them public) and only allowing comments on their videos from “Friends” or “No One”.

What this means for businesses

Whilst most businesses aren’t running a global social media platform with a billion users consisting mostly of children and young people, there are some interesting points in the decision that really drill down into the nitty-gritty of data protection compliance and from which we can draw some practical warnings and guidance for all businesses, as explained below.

‘Dark patterns’

Finding 6 concerned TikTok’s use of ‘dark patterns’.  Dark patterns are techniques used by platform/app/website operators to present privacy options in a way that nudges users towards making more privacy-intrusive choices and/or makes it more difficult for them to choose the more privacy-protective options.

The techniques identified in the TikTok decision include:

  • Use of the term “Skip” on the Registration Pop-up – this was considered to “trivialize” the decision to opt for a private account. 
  • The “Skip” option was on the right-hand side – it was considered that this leads to a majority of users choosing it, “as internet and social media users are used to the button on the right side leading them to fulfil a step and go further (muscle memory)”.
  • On the Video-Posting Pop-Up, the option to post video publicly is not only displayed on the right, but also shown in bold darker text – nudging users to select it. 
  • The Video-Posting Pop-Up referred to the possibility of changing preferences in the privacy settings but didn’t link directly to those settings, so users would have to select ‘Cancel’ and then go through the trouble of looking for the privacy settings, where they would need to find the exact setting that governs the visibility of the account/switching to a private account – it was considered that this makes it less likely that users will change their settings and easier to just go along with the pre-set public settings.

The European Data Protection Board (EDPB) Guidelines on Data Protection by Design and by Default say:

  • “Options should be provided in an objective and neutral way”.
  • Controllers should not “present the processing options in such a manner that makes it difficult for data subjects to abstain from sharing their data” or “nudges the data subject in the direction of allowing the controller to collect more personal data than if the options were presented in an equal and neutral way”.
  • Controllers should not “make it difficult for the data subjects to adjust their privacy settings and limit the processing”.

The EDPB agreed that the TikTok Registration Pop-Up and the Video Posting Pop-Ups were “nudging the user to a certain decision” and leading them “subconsciously to decisions violating their privacy interest”.

Dark patterns are something of a hot topic in the UK at the moment: in August the Information Commissioner’s Office (ICO) and Competition & Markets Authority (CMA) issued a joint paper on Harmful Design in Digital Markets, setting out the harms that can arise when certain types of online practices are used to present information and choices to consumers about the collection and use of their personal information (see ICO-CMA joint paper on Harmful Design in Digital Markets | DRCF).  This is worth a read, as it sets out further examples of ‘bad practice’ to avoid and explains why they are seen as problematic by both the ICO and the CMA.

This finding shows the importance of how privacy options are presented to users – the exact wording used, the visual impact and the user experience.  The examples cited in the TikTok decision and the ICO-CMA joint paper provide useful practical guidance; but applying general rules such as “don’t try to manipulate your users” and “make it easy for your users to select the most privacy-protective settings” when designing user privacy interfaces for your platform/website/app will also help you stay compliant.
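By way of illustration only, here is a minimal sketch of how a registration pop-up’s account-visibility choice might be modelled so that neither option is privileged.  This is our own example in TypeScript – the type names, labels and settings path are hypothetical and are not taken from the decision or from TikTok’s code.

  // Hypothetical sketch: a neutral account-visibility choice.
  // Both options share the same visual weight and wording style;
  // the more privacy-protective value is the pre-selected default (Article 25(2)).

  type AccountVisibility = 'private' | 'public';

  interface VisibilityOption {
    value: AccountVisibility;
    label: string;        // neutral wording – no "Skip", nothing that trivialises the choice
    description: string;  // spells out the consequence of the choice
  }

  const visibilityOptions: VisibilityOption[] = [
    {
      value: 'private',
      label: 'Private account',
      description: 'Only people you approve can see your videos.',
    },
    {
      value: 'public',
      label: 'Public account',
      description: 'Anyone, on or off the platform, can see your videos.',
    },
  ];

  // Privacy-protective default, presented for an explicit choice rather than a "Skip".
  const defaultVisibility: AccountVisibility = 'private';

  // Deep-link to the exact setting, so users who change their mind later
  // don't have to hunt through menus to find it.
  const privacySettingsLink = '/settings/privacy/account-visibility';

The point of the sketch is simply that the privacy-protective value is the default, both options carry equal visual weight, and the pop-up links directly to the relevant setting rather than leaving users to search for it.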

Bad actors

A key issue contributing to the findings regarding TikTok’s risk assessments and non-compliant processing was that the public-by-default settings meant that children could become targets for bad actors, leading to the risk of “deleterious activities” including online exploitation, grooming, social anxiety, self-esteem issues, bullying or peer pressure resulting from disclosure of personal data in videos; as the DPC put it: “the risk of bad actors misusing the TikTok platform to acquire personal data in a manner that is deleterious to the rights and freedoms of data subjects.”

Despite TikTok’s protestations, the DPC confirmed that it wasn’t attributing the actions of bad actors to TikTok, but that TikTok has a responsibility to implement appropriate measures to prevent its platform settings being used for purposes other than those intended by TikTok.  Public-by-default settings mean that videos containing personal data are exposed to an indefinite audience, including non-TikTok users, giving rise to risks related to possible communication between child users and dangerous individuals, both on and off the TikTok platform.

These findings show the importance of factoring in the potential for bad actors to carry out harmful activities using your platform/website/app, both in your data protection impact assessments (DPIAs) and in the way you design and build it.  They also highlight that supervisory authorities expect businesses to do everything within their power to prevent their platforms/websites/apps being used by bad actors for harmful purposes that are incompatible with the purposes for which the businesses collect and process personal data.

Doing data protection impact assessments (DPIAs) properly

Part of the reasoning behind Finding 4 (public-by-default profile settings for child users breaching the requirement to implement appropriate measures to comply with the GDPR in Article 24(1)) was that there was no documented evidence of TikTok assessing the specific risk of children under 13 accessing the platform and the further risks that may arise from this: it wasn’t addressed in TikTok’s relevant DPIA titled “Children’s Data and Age Appropriate Design” or any other DPIA.  As the DPC said: “It is not clear why [TikTok] has not done so.” 

Perhaps TikTok overlooked this risk because (a) the TikTok platform is expressly not intended for children under the age of 13 and (b) TikTok took it as read that its age verification processes were adequate to prevent under 13s from accessing the platform.

Whatever was behind this oversight, this finding highlights the importance of thinking widely about potential risks when completing DPIAs and ‘assuming the worst’ when framing the scope of a DPIA, including that your platform/website/app user terms may be ignored and your age verification (or other user verification) processes may be inadequate, cheated or otherwise rendered useless. 
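To make that concrete, here is a minimal sketch of how ‘assume the worst’ scenarios might be recorded as structured entries in a DPIA risk register.  Again, this is our own TypeScript illustration – the field names and example entries are hypothetical and are not drawn from TikTok’s DPIA.

  // Hypothetical structure for capturing worst-case assumptions in a DPIA risk register.
  type RiskLevel = 'low' | 'medium' | 'high';

  interface DpiaRiskEntry {
    assumption: string;            // the 'assume the worst' scenario being tested
    affectedDataSubjects: string;  // who is exposed if the assumption holds
    possibleHarm: string;
    likelihood: RiskLevel;
    severity: RiskLevel;
    mitigations: string[];
  }

  const worstCaseRisks: DpiaRiskEntry[] = [
    {
      assumption: 'Users under 13 defeat age verification and create accounts',
      affectedDataSubjects: 'Children under 13',
      possibleHarm: 'Their content is exposed to an indefinite audience via public-by-default settings',
      likelihood: 'medium',
      severity: 'high',
      mitigations: [
        'Private-by-default accounts for all users',
        'Strengthened age assurance',
        'Proactive detection and removal of underage accounts',
      ],
    },
    {
      assumption: 'User terms prohibiting misuse of the platform are ignored',
      affectedDataSubjects: 'All child users',
      possibleHarm: 'Bad actors harvest personal data from publicly visible content',
      likelihood: 'medium',
      severity: 'high',
      mitigations: [
        'Restrict default visibility of content',
        'Limit who can contact or pair with child users',
      ],
    },
  ];

However it is recorded, the key is that the DPIA captures scenarios in which your own controls fail, not just the risks that arise when everything works as intended.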

Doing privacy notices properly

Part of the reasoning behind Finding 5 (failure to provide adequate information to child users about who can see their content, in breach of the rules on transparency and provision of information in Articles 12(1) and 13(1)(e)), was that the DPC considered certain words/terms used in TikTok’s privacy notices to be inadequate:

  • “may” – this was considered to be a conditional term that failed to communicate who would definitely receive children’s personal data.
  • “third parties” – this was considered to be an imprecise umbrella term that is unclear and opaque as it doesn’t provide children with specific information about recipients of their personal data.
  • “everyone” and “anyone” – these were also considered to be vague and opaque, as it wasn’t clear if they referred to all TikTok users or anyone who could access the platform via the website.

These types of terms are regularly seen in privacy notices, but the message from the TikTok decision is that such vague and opaque words and terms aren’t good enough.  This shows the importance of (a) knowing exactly which individuals and organisations have access to personal data from your platform/website/app, and (b) describing those recipients clearly, definitively and honestly in your privacy notices.

Get in touch

Do get in touch with us if any of this has prompted you to think that you may need to make changes to your DPIAs, privacy notices (including pop-up notices and user options), platform/website/app design/settings or data processing practices.  We’re experienced in reviewing and assessing these materials and processes, providing practical advice and drafting appropriate wording to help businesses comply with their data protection obligations as platform/website/app operators.  

