Where unsolicited direct marketing to any person is carried out through the sending of electronic mail (i.e., any text, voice, sound or image message, including SMS text messages), the ePrivacy Directive (as transposed in the relevant EU Member State) will apply to such communications. The ePrivacy Directive is not specifically directed at communications made to children but, regardless of whether the communication is sent to an adult or a child, the general rule is that the consent of the individual recipient is required. This must be GDPR-standard consent, although there may be exemptions which may be relied upon in limited circumstances.
National rules which transpose the ePrivacy Directive set out specific requirements as regards direct marketing; these vary across the respective EU Member States.
Contextual advertising is generally understood to be advertising that does not involve the processing of personal data. However, where it does involve such processing, the rules of the GDPR (and of any relevant EU Member State law implementing the GDPR) apply.
Contextual advertising is subject to general EU rules, including the Unfair Commercial Practices Directive (UCPD) and the Audiovisual Media Services Directive (AVMSD), which prohibit unfair, misleading, or aggressive practices that exploit the inexperience or credulity of minors. Advertisers must also ensure that such communications are clearly recognisable as commercial content and do not encourage unsafe or socially irresponsible behaviour.
In this context, contextual advertising must not rely on or include so-called “dark patterns” – manipulative design choices aimed at exploiting users’ vulnerabilities, including those of children. While there is currently no express ban on dark patterns at EU level, the European Commission is actively preparing the Digital Fairness Act, expected in 2026, which is likely to introduce explicit prohibitions on dark patterns targeting minors. In the meantime, such practices may already be prohibited under the UCPD where they amount to misleading or aggressive commercial practices, and some EU Member States have case law or sectoral guidance on this point.
Finally, contextual advertising to children must also comply with any applicable EU or national legislation or local standards that prohibit or restrict advertising of certain products to minors – including, for example, alcohol, gambling, sugary drinks, or age-restricted content. These sector-specific rules vary across jurisdictions and must be taken into account when designing advertising aimed at, or accessible by, children.
The Digital Services Act (“DSA”) prohibits online platforms from targeting advertising at children based on profiling (as defined in Article 4(4) GDPR) which uses their personal data where the online platform in question is reasonably certain that the recipient is a child. This has been confirmed by the European Commission in its draft Article 28 Guidelines.
In addition, Recital 38 of the GDPR notes that the specific protection which children merit with regard to their personal data applies in particular to the use of children's personal data for the purposes of marketing. In its 2013 Opinion on Apps on Smart Devices, the European Data Protection Board's ("EDPB") predecessor, the Article 29 Working Party, stipulated that, in the best interests of the child, organisations should not process children's personal data, whether directly or indirectly, for behavioural advertising purposes, as to do so would be outside the scope of a child's understanding and therefore constitute unlawful processing.
The EDPB has reiterated this principle in its Guidelines on Automated individual decision-making and Profiling for the purposes of Regulation 2016/679, in which it states that organisations should, in general, avoid profiling children for marketing purposes, due to their particular vulnerability and susceptibility to behavioural advertising.
Further, Article 6a of the Audiovisual Media Services Directive (Directive 2010/13/EU) ("AVMSD"), as amended, provides that personal data of minors collected or generated by media service providers for the purposes of complying with the separate obligation to protect minors from audiovisual media services which may impair their physical, mental or moral development must not be processed for commercial purposes, such as direct marketing, profiling and behaviourally targeted advertising. These rules have been transposed in national laws concerning the regulation of audiovisual media services across Member States.
At EU level, the European Commission has launched investigations under the EU Digital Services Act into a number of online platforms concerning, among other things, advertising practices likely to exploit minors' vulnerabilities, including the design of algorithmic systems that may foster addictive behaviour. For further information, see the response to this question.
At national level, various audiovisual regulators have taken enforcement action under national provisions transposing the AVMSD.
Additionally, the Consumer Protection Cooperation (CPC) Network coordinated enforcement action against an online game over misleading advertising targeting children, which resulted in changes to its commercial practices. These cases illustrate increased scrutiny of advertising to children across the EU, with sanctions ranging from binding commitments to administrative fines.
Moreover, in 2025, the ICPEN Sweep revealed widespread manipulative design practices in mobile and online games targeting children, including sneaking, nagging, and obstruction techniques. These practices are already prohibited under the UCPD where they amount to unfair commercial practices, and their identification has reinforced the EU-level push for stricter future regulation, including under the forthcoming Digital Fairness Act.