
Protecting Children from Harmful Advertising: A Marketer, Creator, and Mum’s Perspective
I recently attended an ASA webinar on protecting children from harmful advertising. We heard from teenagers, the NSPCC, regulators, and industry leaders. And I realised I’m part of the system.

I create online ads for clients. I make budgeting and placement decisions. And I’m also a mum teaching my four- and eight-year-olds that adverts present things in particular ways because someone wants you to do something.

It’s tricky to navigate, and after the webinar I’m clear that I need to do better. Here’s what I learned and what I’m committing to change.

What Young People Are Actually Saying

The NSPCC’s Voice of Online Youth, a group of 14- to 17-year-olds from across the UK, has been developing a manifesto for change since April 2024. They identified five priorities for a safer digital world, one of which is to reduce the harmful impact of online advertising.

Here’s what they’re calling out:

  1. UX Dark Patterns
    “Should be easy to turn off personalised ads without making the app harder to use. I tried to turn off ads in YouTube and stopped seeing YT shorts. This makes people feel forced to turn them on.”
    They’re identifying that platforms use friction as a retention tool, making privacy choices so inconvenient that consent becomes coerced.
  2. Blurred Lines Between Ads and Content
    “Not clear as to what is opinion vs advert.”
    Influencer marketing, sponsored content, native advertising – the boundaries have dissolved. For my kids, I can point at a TV ad break. But on YouTube, TikTok, Instagram, the content often is the advert.
  3. Exploitative Targeting
    “Ads prey on people’s insecurities.”
    Algorithms know what makes people vulnerable (body image, social anxiety, FOMO) and serve ads designed to exploit those exact pressure points.
  4. Dangerous Content
    “Make sure ads aren’t promoting dangerous topics.”
    Self-harm, eating disorders, gambling, unrealistic body standards… young people are calling it all out.
  5. Age-Appropriate Regulation
    “Regulate what children and young people can see online. Children see things meant for adults.”
    Age gates are laughably easy to bypass. Adult-targeted ads routinely appear in feeds accessed by children. The infrastructure itself needs to prevent exposure, not rely on an honour system.

The Regulatory and Industry Response

The good news is there’s real momentum…

ASA’s AI-Assisted Monitoring

The ASA is now in year three of a five-year strategy using artificial intelligence to proactively detect and prevent irresponsible ads online. With API access from Google and Meta, they’re capturing around 40 million online ads per year and running them through AI models. Rather than reactive complaint-handling, they’re hunting for violations.

The 100 Children Report

The ASA’s 2022 study found that 93% of 11- to 17-year-olds had social media profiles, and at least 11% of those accounts were falsely registered with a date of birth showing 18+. Of the 11,500 ad impressions captured, 435 (3.8%) were for age-restricted products such as alcohol and gambling. Of those, 73 were clear breaches served to children registered as under 18, and a further 261 were served to children whose accounts claimed they were 18+.

The “Seatbelt Moment”

Katie Streten from SuperAwesome called this our “seatbelt moment”: brands are no longer debating whether protections are needed, and it’s now about implementation. The NSPCC study with Boringa showed that investing in kids’ safety online is good for business.

As a Marketer: Assume Children Will See It

It struck me during the webinar that a considerable part of the discussion was around verifying children’s ages, yet I’ve never needed to verify my age to access content online. Are we putting the emphasis in the wrong place?

Instead of asking “Is this child really 13?”, maybe we should be asking “How can we ensure adult-only content stays in adult spaces?”

Since that issue won’t be resolved any time soon, my principle moving forward will be to assume children will see whatever we put out online. If it’s not appropriate for them, it’s our responsibility as advertisers to do all we can to protect them from it.

We can also consider ad placements and the age verification methods used by platforms beyond self-declared age. Are they using layered targeting (behavioural signals, age estimation AI, device data)?

Where is my clients’ ad spend actually going? Which environments are we funding? How do they evaluate platforms for child safety?

Global ad spend will exceed $1 trillion this year. As Suzy Ryder from OMD UK said, “Brands don’t just buy media, we shape the online world. That comes with responsibility.”

As a Creator: Words Matter, CTAs Matter, Clarity Matters

During the webinar, the ASA shared case studies of advertisers who got it wrong:

  • A Jägermeister competition that required users to “like and share” to enter, pushing the ad onto feeds of under-18s
  • Moshi Monsters (a children’s game) used commanding language: “Members get more missions… JOIN NOW”, which was deemed a direct exhortation to children
  • A gambling ad featured a Premier League footballer with high appeal to under 18s, without watertight targeting

These advertisers didn’t think through the implications of their creative choices. One way to avoid such mistakes is the ASA’s Copy Advice service: free, confidential, expert guidance available before campaigns go live.

We should also be vetting influencer partners for child safety practices, not just engagement rates. Think carefully about whether children should appear in content for age-restricted or sensitive products.

As ad creators, we discuss and test CTAs constantly, but we don’t think enough about our responsibility to avoid exploiting insecurities. Even “soft” CTAs can prey on vulnerabilities. The language we choose, the pressure we apply (or don’t), and the imagery we select all have impact.

As a Mum: Education is My Responsibility (But I Can’t Do This Alone)

I’ve lost count of the number of YouTube videos my eight-year-old has watched that say things like “Give this video a thumbs up or you’re not a good person” or “Only people who care about this poor orphaned kitten will subscribe.”

We’ve had to have deep conversations about why those channels want him to take those actions, and why they’re trying to fool him into doing it.

As parents, education is our responsibility. We cannot rely on platform protections to guarantee that our kids are in safe online spaces. We need to teach them what adverts are, how to spot one, and why they shouldn’t simply do whatever they’re told to do online.

But parenting can’t fix systemic design flaws. When platforms use dark patterns to make privacy choices inconvenient, when algorithms exploit insecurities, when age verification relies on ticking a box, no amount of kitchen-table conversations will fully protect my children.

So do we ban them completely or cross our fingers and hope education is enough?

What gives me hope is that 97% of teens stay in stricter settings when given good tools. Young people want protection when it’s well-designed and user-friendly. Media literacy is also becoming mandatory in the UK school curriculum from 2028. In our household, we will be making media literacy part of daily life.

Resources and Next Steps

For Marketers:

For Creators:

For Parents:

General:

If you would like to watch the webinar recording, you can catch the full replay on the ASA’s YouTube Channel.

The Path Forward

Young people want better protections and the technology is improving, but we are still a long way off. Rather than focus on how frustrating the situation is, I’m going to focus on what I CAN do as a marketer, a creator and a parent to protect children online.

As a marketer, I’ll demand transparency and make child safety a criterion.

As a creator, I’ll use expert guidance, vet partners carefully, and ensure my ads are unmistakably labelled.

As a mum, I’ll continue to teach media literacy and advocate for systemic change so I’m not the only line of defence.

When we protect children, we protect the integrity of advertising itself, and that benefits everyone. Whether you’re working in this industry or raising kids in a digital world, or both, we all have a role to play.
