PDPC Advisory Guidelines on the Use of Personal Data in AI Recommendation and Decision Systems

Key Takeaways:

  • On 1 March 2024, the Personal Data Protection Commission (“PDPC”) published the Advisory Guidelines on the Use of Personal Data in AI Recommendation and Decision Systems (“Advisory Guidelines”).
  • The Advisory Guidelines are not legally binding, but in carrying out its enforcement of the Personal Data Protection Act (“PDPA”), the PDPC is likely to take positions which are consistent with these Advisory Guidelines. 
  • Broadly, the Advisory Guidelines explain when it may be appropriate for organisations to rely on certain exceptions under the PDPA when using personal data to develop AI Systems, and set out recommended data handling and accountability measures when deploying AI Systems.
  • This article updates our previous article on the version of the Advisory Guidelines published for the PDPC’s public consultation held in 2023.

Background:

On 1 March 2024, the PDPC published the Advisory Guidelines on the use of personal data for the design and deployment of systems that embed machine learning models (“AI Systems”). As the Advisory Guidelines are intended to apply specifically to AI recommendation and decision systems, the document does not address use cases relating to the training and deployment of generative AI systems.

In keeping with other advisory guidelines issued by the PDPC, the Advisory Guidelines are not legally binding, but the PDPC is likely to interpret and enforce the PDPA in a way which is consistent with these Advisory Guidelines. A stated purpose of the Advisory Guidelines is to “provide organisations with certainty on when they can use personal data to develop and deploy” AI Systems. 

Who is impacted?

The Advisory Guidelines apply to any organisation which collects and/or uses personal data to develop and deploy AI Systems.

The Advisory Guidelines also provide recommendations applicable to data intermediaries who are engaged by organisations to provide professional services for the development and deployment of bespoke or fully customisable AI Systems.

What do the Advisory Guidelines say?

Key points covered by the Advisory Guidelines include:
  1. Business Improvement Exception and Research Exception. AI developers may rely on exceptions to the PDPA requirement for consent where personal data is collected or used for certain purposes relating to the training of AI Systems. Exceptions which may be relevant include where personal data is processed for business improvement purposes, per Part 5 of the First Schedule of the PDPA (“Business Improvement Exception”), or for research purposes, per Part 2 of the Second Schedule of the PDPA (“Research Exception”). A key difference between these exceptions is that the Business Improvement Exception only caters for sharing of data with related companies, whereas the Research Exception does not have a similar restriction. 

    The Business Improvement Exception may be relevant when an organisation collects or uses personal data to develop or enhance AI Systems with the aim of improving operational efficiency by supporting decision-making, or offering new or more personalised products and/or services. Examples include developing or enhancing recommendation engines in social media services to offer targeted content to users based on their browsing history, or job assignment systems which automatically assign jobs to platform workers.

    The Research Exception may be relevant when conducting commercial research to advance science and engineering without any immediate application to an organisation’s products, services, business operations or market.

  2. Data Protection Considerations. The Advisory Guidelines advise organisations to implement appropriate legal, technical and process controls when handling personal data to develop or enhance AI Systems. For example, organisations should only use personal data containing attributes required to train and improve an AI System, and only use the volume of personal data necessary to train the AI System based on relevant time periods and any other relevant filters. Where possible, organisations should also pseudonymise or de-identify personal data as a basic control, and are encouraged to anonymise their datasets.
  3. Consent and Notification. Where organisations deploy AI Systems to provide recommendations, predictions or decisions based on individuals’ personal data, the Advisory Guidelines reiterate that they must comply with consent and notification obligations under the PDPA unless relevant exceptions apply. The Advisory Guidelines outline what information individuals should be provided with and how the information should be provided.
  4. Legitimate Interests. The PDPC noted that the legitimate interests exception, per Part 3 of the First Schedule of the PDPA (“Legitimate Interests Exception”), may be applicable in relation to certain uses of personal data where AI Systems are deployed. In particular, the PDPC provided the example of the processing of personal data as input in an AI System for the purposes of detecting or preventing illegal activities. It is noteworthy that the PDPC did not refer to the Legitimate Interests Exception in relation to the development, testing and monitoring stage of AI System implementation.
  5. Accountability. The Advisory Guidelines recommend that organisations be transparent and provide information in their written policies on relevant practices and safeguards to achieve fairness and reasonableness in their use of AI Systems. Examples are provided of information which might be included in such policies. Although the PDPA only requires organisations to make information on their data protection policies available upon request, the Advisory Guidelines recommend that organisations pre-emptively make such policies available online in order to build trust with individuals and demonstrate accountability in their compliance with the PDPA.
  6. Procurement of AI Systems. A set of recommendations is also provided specifically for third-party service providers that provide professional services for the development and deployment of bespoke or fully customisable AI Systems. Such service providers are typically data intermediaries, and the Advisory Guidelines provide a list of recommendations to help such service providers comply with their own obligations, as well as to support the organisations which engage them with PDPA compliance.

What’s next?

During her Committee of Supply 2024 speech, Minister for Communications and Information Josephine Teo stated that the PDPC will next be considering the provision of guidance on the use of personal data to train generative AI systems. If such guidance is provided, it is likely that a public consultation will be held on these proposed guidelines. 

If you have any questions or would like to discuss any issues, please do not hesitate to contact us.

This article is produced by our Singapore office, Bird & Bird ATMD LLP. It does not constitute legal advice and is intended to provide general information only. Information in this article is accurate as of 4 March 2024.
