Information Commissioner publishes Age Appropriate Design Code

Online services with a UK connection need to be (re)designed with kids in mind

On 22nd January 2020, the UK Information Commissioner published her Age Appropriate Design Code. The code applies to organisations in the UK. It also applies on a worldwide basis to organisations that monitor kids in the UK, or where it's apparent that they intend to offer online services or goods to kids in the UK. The code is not limited to child-directed sites: it applies whenever it's more likely than not that under-18s will use the service. The code is expected to be fully effective from Autumn 2021.

The code is much wider than parental consent requirements

The GDPR contains rules requiring organisations to obtain parental consent in order to process data of kids below an age threshold of between 13 and 16 (depending on the age selected by each EU member state). The scope of the code is much wider than this: it requires online services to be designed with the best interests of the child in mind. This must be the primary consideration in all associated processing of personal data. It is to be achieved through 15 key principles – which link to privacy by design and by default; transparency; and accountability. The code states that where online services are likely to be accessed by kids, a DPIA is mandatory. Impact on children is to be considered at every stage and cannot be "bolted on" as an afterthought. Existing DPIAs must be updated to achieve this.

What is the status of the code and what are the sanctions for non-compliance?

The UK Data Protection Act 2018 places an obligation on the Commissioner to produce the code. Failure to comply with the code is not automatically a breach of the law. However, both the Commissioner and the UK courts are required to take it into account. As the code represents the Commissioner's view on what is required when processing children's personal data, failure to comply with it may well lead to sanctions under data protection legislation. The Commissioner notes that this is an enforcement priority for her and that she will engage in proactive audits.

The code applies to most websites, apps, online games and connected toys

The code applies to "information society services" which are likely to be accessed by kids. This would include websites and apps making products or services available online; news sites; games; education sites; messaging; content streaming; and connected toys and devices.

"Information society services" are services provided by electronic means, at a distance and at the individual request of the particular user. The services must be provided "for remuneration" – which would include services which are free to the user because they are ad-funded. Counselling and preventive services are excluded. Processing which isn’t subject to the GDPR (e.g. processing by law enforcement authorities) is also not affected. There is a useful flow chart at the end of the code, to give guidance on scope.

The code covers data collected from the child, but also data inferred from this information or from the child's behaviour online.

The code applies to all online services "likely" to be accessed by under 18s

The code is deliberately not limited to child-directed sites. It applies to online services if it is more likely than not that under-18s will access them. Here, the Commissioner recommends a common sense approach, looking at the nature of the service and how it is accessed. Market research or user-provided evidence can be taken into account. The measures which a site takes to prevent kids accessing it can also be relevant.

On the internet, no-one knows you are a kid

The code does not prescribe how organisations should assess visitors' ages. It puts forward a range of options, from self-declaration through to use of AI and requirements to provide official ID documents. The code acknowledges that requirements to provide additional information to prove age may present privacy risks – the approach taken should reflect the risks of the service and should incorporate privacy by design and data minimisation principles. For lower-risk online services, self-verification may be sufficient; for high-risk online services, independent verification may be appropriate. The code acknowledges this is a challenging and developing area.

Age-appropriate design must be appropriate to the age of the child

The code is not about one approach for kids and another for adults. Age-appropriate design means just that. The code divides children into five developmental age ranges – 0–5; 6–9; 10–12; 13–15; and 16–17. It provides guidance in an appendix on developmental factors for each age group. In multiple parts of the code, it provides suggestions on how to tailor design with these different ages in mind. To take an example, a primary school child may actively need to be deterred from changing high-privacy default settings, whereas a teenager might be better supported by clear and neutral information which helps them make their own decision.
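To make the tailoring concrete, here is a minimal illustrative sketch. The five age bands come from the code, but the function names and behaviours are assumptions of our own – the code prescribes no particular implementation:

```typescript
// Hypothetical sketch only: the five age bands are the code's; everything
// else here is an illustrative assumption.
type AgeBand = "0-5" | "6-9" | "10-12" | "13-15" | "16-17" | "adult";

function ageBand(age: number): AgeBand {
  if (age <= 5) return "0-5";
  if (age <= 9) return "6-9";
  if (age <= 12) return "10-12";
  if (age <= 15) return "13-15";
  if (age <= 17) return "16-17";
  return "adult";
}

// Example: tailoring the response when a user tries to weaken a
// high-privacy default setting, per developmental stage.
function settingsChangeResponse(age: number): "deter" | "explain" | "inform" {
  switch (ageBand(age)) {
    case "0-5":
    case "6-9":
      return "deter";   // actively discourage changing high-privacy defaults
    case "10-12":
      return "explain"; // age-appropriate explanation before any change
    default:
      return "inform";  // clear, neutral information supporting their own decision
  }
}
```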

Best interests of the child

The key principle in the code is that the child's best interest must be the primary consideration when designing online services. The UN Convention on the Rights of the Child is relevant in determining this. The Commissioner recognises the importance of children being able to access information and the importance of play (including online play).

The child's best interest is not the sole consideration: commercial interests are also allowed and are relevant. However, if there is tension between the two, the best interests of the child must come first. The code draws a strong link between this principle and fair and lawful processing under the GDPR.

Conversely, one of the 15 principles prohibits uses of personal data which are detrimental to the interests of the child. This would include processing which has been shown to be detrimental to the child's well-being, or where studies suggest that this could be the case. The code refers to compliance with other codes which contain requirements to prevent harm to children (such as the CAP Code on marketing, the press and broadcast codes and the OFT guidance for online games). The code also makes a point of noting that this principle will restrict use of "stickiness" – features designed to make it hard for a user to disengage.

Privacy by design and by default: data minimisation, limited data sharing and profiling, default settings and nudging

Relevant services must collect and retain the minimum personal data necessary to provide the service. Kids must be given choices over any processing which is not necessary. For example, on a music streaming platform the core service – for which processing is "essential" – is delivering the requested track; ancillary services, such as providing recommendations or sharing users' playlists, are non-essential processing which should be off by default unless the child opts in.

On this point, the Commissioner warns she'll look "very carefully" at claims that a privacy setting cannot be provided because the personal data is needed to provide the core service. Accordingly, organisations which over-interpret "essential service" do so at their peril.
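Sketching the point in code (illustratively only – the purpose list and names are our assumptions, not the code's), the split means each processing purpose is classified up front, and anything non-essential runs only on an active opt-in:

```typescript
// Hypothetical sketch: classifying a streaming service's processing purposes.
// The purpose list and names are illustrative assumptions, not the code's.
interface ProcessingPurpose {
  name: string;
  essential: boolean; // genuinely needed to deliver the core service?
}

const purposes: ProcessingPurpose[] = [
  { name: "deliver the requested track", essential: true },
  { name: "personalised recommendations", essential: false },
  { name: "share playlists with other users", essential: false },
];

// Non-essential purposes run only after the child actively opts in.
function mayProcess(purpose: ProcessingPurpose, optedIn: boolean): boolean {
  return purpose.essential || optedIn;
}
```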

Processing of geolocation data is singled out, both as an example of this and as one of the 15 principles in its own right: geolocation data collection should be switched off by default, unless there is a compelling reason to collect it. If data has to be collected (e.g. to provide map-based services), then this should only be while necessary for the service, and the user should have to make an active choice to share beyond this. The service should use an obvious sign to show location data is being collected.
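A minimal sketch of that expectation (the state shape and names are assumptions of ours, not anything the code specifies): location defaults to off, turns on only for the current session in response to an explicit action, and drives a visible indicator while it is on:

```typescript
// Hypothetical sketch: geolocation off by default, active per-session
// opt-in, and an obvious indicator while collection is running.
interface LocationState {
  collecting: boolean;       // default: off
  indicatorVisible: boolean; // obvious sign shown whenever collecting
}

const defaultLocation: LocationState = { collecting: false, indicatorVisible: false };

// Called only in response to an explicit user action, never silently.
function startSessionLocation(): LocationState {
  return { collecting: true, indicatorVisible: true };
}

// Collection ends with the session unless the user made a further active choice.
function endSession(): LocationState {
  return defaultLocation;
}
```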

Profiling should be switched off by default unless there is a compelling reason to allow it – and it should only be used if there are measures to protect users from harm. The code gives the example that if you profile to serve personalised content or ads, then you must ensure no detrimental content is served. Any personal data you process to protect the child and to ensure content is suitable cannot then be used for incompatible purposes; commercialising data collected to protect children (or using it to target ads) would fall foul of this. The code also reminds us that profiling for ads targeted at under-13s is likely to need parental consent under art.8 of the GDPR.

Relevant services should not share data – even intra-group – unless they can demonstrate a compelling reason, taking into account the best interests of the child. The code gives extreme examples of what is permissible: sharing with a school or the police in the event of safeguarding concerns. However, selling data for commercial re-use is unlikely to be justifiable.

Notwithstanding this last requirement, the code notes that targeted advertising is not banned – but it must be turned off by default. In addition, sites cannot bundle multiple options for profiling into one request for consent – so there should be separate options to turn on personalised content and personalised ads.
Where choices are offered, the default option should always be the most privacy-friendly. Kids will just accept whatever default settings are provided, and therefore, according to the Commissioner, "it is of utmost importance…defaults set are appropriate for children and provide them with adequate protection". The code gives the examples of privacy settings for making content visible to others, for allowing the service to use content for non-essential purposes (for example, marketing permissions), or for sharing it with others. Nudging children (via language or design) to change settings to a less privacy-friendly option is not permitted: any nudging should be towards privacy-friendly options and behaviours that support the child's well-being and health (such as pause buttons to combat excessive screen time).
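Read together with the unbundling point above, one illustrative shape (all names here are our assumptions) is a settings model where every non-essential option is a separate, default-off switch, changed only through its own neutral prompt:

```typescript
// Hypothetical sketch: a separate, default-off switch for each non-essential
// purpose, reflecting high-privacy-by-default and unbundled consent options.
interface ChildPrivacySettings {
  personalisedContent: boolean;   // its own option, not bundled with ads
  personalisedAds: boolean;       // its own option, off by default
  contentVisibleToOthers: boolean;
  marketingPermissions: boolean;
}

const highPrivacyDefaults: ChildPrivacySettings = {
  personalisedContent: false,
  personalisedAds: false,
  contentVisibleToOthers: false,
  marketingPermissions: false,
};

// One setting changes at a time via an explicit, neutrally worded prompt –
// deliberately no "accept all" shortcut that nudges kids past the defaults.
function setOption<K extends keyof ChildPrivacySettings>(
  settings: ChildPrivacySettings,
  key: K,
  value: ChildPrivacySettings[K]
): ChildPrivacySettings {
  const next = { ...settings };
  next[key] = value;
  return next;
}
```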

Say what you do (in a way that kids can understand)

Privacy notices – and other information (such as terms and conditions or policies and standards) – must be concise, prominent and in clear language suitable for the age of the child. "Just in time" notices when data is collected are recommended.

This could mean multiple versions of a privacy notice – with options to have it explained differently if it's not clear, or to go into more detail if the information is too basic. Effective communication may mean cartoons or videos – or options to provide information by audio or video. For very young children, information will need to be aimed at parents.
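Purely as an illustration (the version names and formats are our own assumptions), a service might select the notice format by the code's age bands along these lines:

```typescript
// Hypothetical sketch: picking a privacy notice format by age band.
// The version names and formats are illustrative assumptions only.
type NoticeVersion = "parent" | "cartoon" | "simple" | "teen" | "full";

function noticeVersionFor(age: number): NoticeVersion {
  if (age <= 5) return "parent";  // aimed at parents for the very young
  if (age <= 9) return "cartoon"; // e.g. cartoon or video formats
  if (age <= 12) return "simple"; // concise, clear language
  if (age <= 17) return "teen";   // fuller detail in a neutral tone
  return "full";
}
```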

And do what you say

Relevant services must follow their own policies and standards – across the board and not just for privacy. So if you say you only display "content suitable for children" or that you "do not tolerate bullying" then you need to have adequate mechanisms in place to effectively deal with this. Kids (and the Commissioner) will say processing of personal data is unfair if you can't do this.

If you only rely on back-end processes such as user reporting to identify behaviour which breaches your policies, then that needs to be made clear in your policies. Further, if the risks are high, then "light touch" or back-end processes may not suffice.

Offer online tools

These could range from prompts to explain more, or easier ways to exercise subject access (for example, links or buttons saying "I want to access my information"), through to alerts which children can use to report a safeguarding issue. Where this is possible, such alerts should be prioritised.

Parental controls may also be appropriate. However, relevant services should also include age-appropriate information about these – and should include clear alerts or symbols to show if a parent is monitoring what the child is doing. The code recommends that relevant services also provide information to parents about the child's right to privacy – and that the child's expectations in this regard will grow as the child gets older.

Connected toys and devices need easy-to-use privacy controls

These could be on the device or online. Features to show when data is being collected (for example a light) are recommended.

Clear information about the toy's use of personal data should be readily available, including pre-purchase on the seller's website, at point of sale and on set-up.

If devices are likely to be shared, then they should be designed so that different users can have different settings – with privacy-friendly default settings for child users.
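One illustrative shape for this (the profile fields and names are assumptions of ours): per-user profiles on the device, with any child user starting from, and kept close to, high-privacy defaults:

```typescript
// Hypothetical sketch: per-user settings on a shared connected device.
// Field names are illustrative; the code prescribes no particular design.
interface DeviceProfile {
  user: string;
  isChild: boolean;
  shareUsageData: boolean;
  retainVoiceRecordings: boolean;
}

function createProfile(user: string, isChild: boolean): DeviceProfile {
  return {
    user,
    isChild,
    // High-privacy defaults for every new profile; a child profile should
    // only move off these via age-appropriate, unbundled prompts.
    shareUsageData: false,
    retainVoiceRecordings: false,
  };
}

// Illustrative guard: relaxing a setting on a child profile requires an
// explicit, age-appropriate flow rather than a simple toggle.
function canRelaxDirectly(profile: DeviceProfile): boolean {
  return !profile.isChild;
}
```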

Organisations cannot absolve themselves of responsibility just because they outsource the connected functionality of the product – although in these cases, there may be shared responsibility with the outsourced vendor.

While the code applies to existing relevant services, it accepts that, where products or toys are already on the market, there is no need to make changes to the existing physical products themselves.

Do (or redo) data protection impact assessments

The GDPR makes these mandatory for processing which is "likely to result in a high risk". According to the Commissioner, high risk is inevitable where the code applies – so a DPIA is mandatory.

The DPIA should take into account a wide range of risks to kids – not just privacy risks, but also psychological and developmental risks, as well as social and emotional risks. Readers who are parents may be reassured that the risk of undermining parental authority also has to be considered, as does the risk of excessive screen time. All of this involves an assessment of the likelihood and severity of risk.

The code notes that larger organisations will be expected to consult as part of their DPIAs – this could range from user surveys to a public consultation.

Organisations offering child-focused services (or services which can be expected to be widely used by kids) will also be expected to take advice from experts in children’s rights and developmental needs. 

Where organisations don’t consult on DPIAs because they think it’s unnecessary, disproportionate or just not possible, then they need to document their reasons for the decision and be prepared to justify the conclusion to the Commissioner.

There is a template DPIA in an Annex to the Code.

Be ready to prove compliance

In line with the GDPR principle of accountability, organisations must not only comply with the code but must also be in a position to demonstrate compliance. Given the Commissioner intends to carry out audits proactively, affected organisations would be well advised to have an audit trail in place – such as DPIAs, internal policies, training records and privacy UX documentation – in case the Commissioner comes knocking. The accountability programme should, according to the code, be driven by the DPO and be overseen by senior management at Board level.

We will be hosting a webinar on Wednesday 29 January to explain key provisions in the code and what organisations should do now.
