How can you possibly optimise over 300 digital ads?

The complexity of digital campaigns continues to spiral. Smart analytics and a measured degree of human intervention can bring order to the potential chaos.
17 March 2021
Duncan Southgate

Senior Director, Global Creative, Kantar

Olivia Trozze

Director, Digital Analytics


Digital ad campaigns can get very complex. A relatively straightforward campaign plan can quickly multiply into hundreds of ad permutations. An advertiser may run a campaign with 3 ad formats (photo, short video, long video) across 4 media platforms (Facebook, YouTube, Twitter, and TikTok). Within each platform, the campaign may target 5 audiences (perhaps brand users/non-users, or demographic, interest-based or in-market user segments). As for the actual ads, a campaign may contain 6 or more different creatives (e.g., visual concepts or message themes).

So, with 3 ad formats, 4 media platforms, 5 audiences, and 6 creative units, the campaign reaches 360 cells. And the optimisation challenge begins! Even this is a gross simplification: it ignores device, page placement and sizing issues, not to mention the near-infinite possibilities introduced by personalised creative or dynamic creative units (e.g., geo- or weather-generated variants). The more intricate a campaign strategy, the more challenging it becomes to manage optimisation at scale.
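The arithmetic behind that cell count can be made concrete with a short sketch. The dimension values below are the illustrative ones from this example, not a real campaign plan:

```python
from itertools import product

# Illustrative campaign dimensions from the example above
formats = ["photo", "short video", "long video"]
platforms = ["Facebook", "YouTube", "Twitter", "TikTok"]
audiences = [f"audience {i}" for i in range(1, 6)]   # 5 targeting segments
creatives = [f"creative {i}" for i in range(1, 7)]   # 6 creative units

# Every combination of the four dimensions is one optimisation cell
cells = list(product(formats, platforms, audiences, creatives))
print(len(cells))  # 3 * 4 * 5 * 6 = 360
```

Each additional dimension (device, placement, sizing) multiplies this total again, which is why the count grows so quickly.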

Two approaches to digital campaign measurement

Since the dawn of the internet, the most common campaign effectiveness method has been to measure brand lift of the entire campaign, and then sub-analyse as much as possible. This tried and tested method (conducted over 20,000 times by Kantar) shows whether one platform is more impactful than another, or whether one creative theme works better. But it doesn’t allow us to understand the interaction of the two effects, and mid-campaign optimisation is limited (e.g., results arriving 4 weeks into an 8-week campaign).

Recently we have seen lots of marketers measuring whether the lead ads in a campaign work well in their primary contexts; for example, assessing brand lift for three Facebook ads and three YouTube ads. Rapid, automated ad testing allows you to quickly evaluate if the campaign is working as intended, and to course correct if not. This cost-effective approach is commonly applied for 4 to 10 cells in a campaign, and occasionally for as many as 40 or 50 cells.

However, even with these advances, advertisers or their agencies can’t justify gathering survey-based impact metrics for every format/platform/audience/creative combination in a 360-cell digital campaign.

Can you hand the entire job to the machines?

In-platform auto-optimisations are good at some things, like maximising reach or optimising clicks. But the result is based on the platform’s assumption that advertisers always want the cheapest media, regardless of the end goal. Given that assumption, the algorithms identify and serve ads to people who meet the lowest common denominator. Optimising for reach means your ads go to the people who spend the most time on Facebook; maximising clicks effectively targets people who click on lots of ads, and who might well have clicked whether the creative was impactful or not.

If you want to make a real impact on audience and business goals, these out-of-the-box optimisation algorithms aren’t enough. Today’s sophisticated advertiser needs to leverage their behavioural data independently, to provide the checks and balances that ensure their media is working harder to make a difference.

The demands of complex campaigns

The best approach to comprehensive data-led optimisation is to overlay a structured analytical intervention and a review of behavioural metrics on top of automated systems. Testing lots of creative versions is a good idea. Variety in creative, copy and calls to action across targeted audiences and platforms allows you to try out lots of different combinations in the crowded advertising marketplace. Strategising and trafficking all of your creative/copy combinations is a great start, but then you need to act quickly on the feedback consumers provide.

Our experience shows that it’s more important to cull the weak permutations than to double down on the best. Sounds similar, but it’s different. If we pick the top few ads to run with, eventually we run into the same problem of not meeting the nuances of the specific advertising context or accelerating ad fatigue. Rather, we encourage maintaining variety, and acting swiftly when something isn’t working. Poor performance doesn’t improve with time. It wastes valuable impressions and media dollars. As the saying goes, the definition of insanity is doing the same thing over and over, and expecting different results.

Of course, any optimisation system is only as good as the metrics it is being optimised against. Behavioural metrics such as Likes, Comments, Shares, and Completed Video Views are fantastically useful because they are widely and speedily available. But one size does not fit all, and careful consideration needs to be given to selecting which metrics best reflect campaign objectives. The smartest brands embark on broader validation and calibration exercises to link these metrics to their ultimate brand and sales building objectives, or to Artificial Intelligence (AI) predictions of ad success.

Further, how you analyse the metrics is really important. Interaction rates vary dramatically by ‘cell’, so sorting ads into like-for-like groups by format and context is key. Without smart analytics, overly simple manual optimisations can actually do more harm than good.

Optimise with confidence

Kantar’s Digital Content Optimisation leverages sophisticated analytics and a simple oversight routine to enhance basic media agency optimisations. Its analytics-backed content performance evaluation empowers agencies to act quickly with independent and objective assessment.

Take this real case study: a major US corporation applied Digital Content Optimisation to their brand-building advertising campaign, running 320 ads for 8 weeks. We partnered with the corporate team to develop a custom qualified engagement metric for optimisation. Over the course of the 8 weeks, our model ran each week and the media agency implemented the cuts at the same cadence. By the end of the campaign, performance on the core KPI had improved each successive week, for a +263% gain overall.

Even the simplest digital campaigns are increasing in complexity with new platforms coming online, advanced targeting capabilities, and ever more creative variations. Current solutions for managing creative optimisation at scale only partially meet advertisers’ needs. Smart analytics and a measured degree of human intervention now mean advertisers and media agencies can implement swift in-flight content placement decisions to impact business goals. This allows brands to take more risks and run more tests with confidence.

Get in touch to find out how to optimise your digital campaigns with smart analytics.
