Case Study: Creating a data-driven campaign strategy

I was recently asked to deliver a campaign strategy for a large company here in Singapore. We decided to take a data-driven approach and here I want to break down how I did that.

Step 1: Form some theories (human-centred design approach)

We had access to a detailed post-campaign report, which I went through systematically. It included target personas, which are always useful to have. This is also a good place to look at the company SWOT analysis and other strategic documents, if you have them available. Any campaign strategy should map to the marketing strategy, which should map to the organisational strategy. In this case, the post-campaign report consisted of qualitative data (e.g. comments people made), quantitative data (e.g. engagement rates), and personas. I looked for insights into what had gone well and what had caused friction.

Insight = need + barrier
“I want… but…”

I then grouped these insights around themes, for example the environment, financials, and design. From there I tried to go deeper and find an underlying insight or barrier that was causing the other problems. This required me to take a psychographic perspective on the target audience.

Once I had a few really strong theories, I needed to stress test them in the external world.

Tools

Challenge: Getting to the bottom of the insights and trusting your gut.

Step 2: Stress test the theories (analytical approach)

At this stage, I sorted through all the reports I could find on that target market. As far as possible, I chose reports with large sample sizes that had been statistically treated – this is surprisingly challenging. I then read through the reports with my theories in mind and looked for clues that they were at play in this context too. Every so often I paused to see which theories needed to be discarded and which were still showing up strong. This kind of desk research can take a week or so.

Tools

Challenge: Finding reports with comparable information that had been treated statistically – you usually end up with a patchwork of good studies.

Step 3: Test your chosen theory directly with the target market (using Bayes Rule and Facebook)

By the end of all that reading, I had a good solid theory to test. At this point, I needed to go to the market and ask them.

This is a step many people miss out on, and it is crucial, especially in mass marketing. If you see campaigns falling flat, it’s usually because they missed this step and an internal committee came to a consensus instead. In mass marketing, you are talking to a diverse group of people whose brains process information differently. You have your own way of processing information too. Testing the messaging strategy widely is the best way to reduce the chances of a flop.

In this case, I worked with a partner called The Tenth Floor, who specialises in testing data-driven marketing theories using Facebook ads.

Choosing the exact wording to test

To decide on the exact wording we wanted to test, we scraped Google News. I tested the company’s name against 80 different keywords and compared its profile across those keywords against 15 different competitors. I then settled on about 10 keywords that aligned with the company’s messaging pillars, expressed our message, and profiled well against competitors.
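For illustration, here is a minimal Python sketch of that profiling step, assuming the scraped results have already been tabulated as mention counts per company and keyword. The column names, the numbers, and the share-of-voice cut-off are all hypothetical stand-ins for the comparison described above.

```python
# Hypothetical sketch: shortlist keywords where the client profiles well against
# competitors, given mention counts scraped from Google News. All names and
# numbers are placeholders, not the real project data.
import pandas as pd

mentions = pd.DataFrame({
    "company": ["ClientCo", "ClientCo", "Rival A", "Rival A", "Rival B", "Rival B"],
    "keyword": ["community impact", "returns",
                "community impact", "returns",
                "community impact", "returns"],
    "mentions": [42, 18, 7, 55, 12, 61],
})

# Share of voice: the client's mentions for a keyword as a share of all mentions
totals = mentions.groupby("keyword")["mentions"].sum()
client = mentions[mentions["company"] == "ClientCo"].set_index("keyword")["mentions"]
share_of_voice = (client / totals).sort_values(ascending=False)

# Keep keywords where the client already owns more than half the coverage
shortlist = share_of_voice[share_of_voice > 0.5].index.tolist()
print(share_of_voice)
print("Candidate keywords:", shortlist)
```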

Subhendu at The Tenth Floor then scraped Google News for those keywords and the company name, restricting the search to the most trusted publications across the markets we wanted to address. The data was visualised in Tableau. As is often the case, the news items showed a Pareto distribution – a few of the articles got the majority of the reads.
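As a rough illustration of that Pareto pattern, the sketch below (with made-up read counts) checks what share of the reads the top fifth of articles account for.

```python
# Hypothetical Pareto check: do the top 20% of articles get most of the reads?
# The read counts are invented; the real data came from the Google News scrape.
import numpy as np

reads = np.array([12000, 8500, 3100, 900, 650, 400, 320, 280, 150, 90, 60, 40])
reads = np.sort(reads)[::-1]                        # most-read articles first

top_n = max(1, int(np.ceil(0.2 * len(reads))))      # the top fifth of articles
share = reads[:top_n].sum() / reads.sum()

print(f"Top {top_n} of {len(reads)} articles account for {share:.0%} of reads")
```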

Designing the experiment properly

Sub and I then went through the news and came up with about 40 positively-worded statements that would express the messaging strategy in different ways. We needed to pick 12 to test through Facebook ads with our target market. Whichever statement got the most engagement would be the one that resonated with the target audience.

I trust investment companies that consider the impact of their investments on the community (positive)

As opposed to...

I don't trust investment companies that only think about profits (negative)

Why positive statements? Because we are interested in what compels our audience to act, not what compels them not to act.

However, theories are never 100% right; they exist on a spectrum. If the theory is like a set of weighing scales, then the evidence can be placed on the “true” side of the scale or the “false” side of the scale. Each statement could be treated as evidence to put on the scale in the true or false category – the more engagement a statement got, the more weight we could give it. The proper weighting of evidence is governed by something called Bayes Rule. Bayes Rule is used by people like statisticians, who work with numbered weightings for each piece of evidence, and doctors, who test their theory about what’s wrong with you by treating your symptoms as weighted evidence.

If we use Bayes Rule in a non-statistical way, we can ask ourselves this basic question:

Bayes Rule
How likely am I to come across this evidence if my theory is true or if it is false?

We decided to use Bayes Rule to test our theory:

  1. We chose 4 statements that, if they got a lot of engagement, would prove our theory was true.
  2. We chose 4 statements that, if they got a lot of engagement, would prove our theory was false.
  3. We chose 4 statements that tested a basket of other issues, such as markets of interest.
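To make the weighing of evidence concrete, here is a minimal sketch of Bayes Rule in its odds form, where each statement’s engagement is turned into a likelihood ratio: how much more likely that result is if the theory is true than if it is false. The numbers are purely illustrative – in the real project the weighing was done in the non-statistical way described above.

```python
# Illustrative Bayes Rule update in odds form. Each piece of evidence carries a
# likelihood ratio: >1 tips the scales towards "theory is true", <1 tips them
# towards "theory is false". All values here are made up.

prior_odds = 1.0   # start neutral: true and false equally likely

evidence = [
    ("pro-theory statement got high engagement",  3.0),
    ("pro-theory statement got high engagement",  2.5),
    ("anti-theory statement got low engagement",  1.5),
    ("anti-theory statement got high engagement", 0.5),
]

posterior_odds = prior_odds
for label, likelihood_ratio in evidence:
    posterior_odds *= likelihood_ratio              # Bayes Rule: multiply the odds

posterior_prob = posterior_odds / (1 + posterior_odds)
print(f"Posterior probability the theory is true: {posterior_prob:.0%}")
```

Multiplying the likelihood ratios together is the weighing-scales picture in miniature: evidence that favours the theory pushes the odds up, evidence against it pushes them down.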

How did we use Facebook ads to test with the target audience?

We then put these 12 statements into 36 Facebook ads. Age was important in this particular target market, so we tested the age group we wanted to target, the age group above, and the age group below – 12 statements across three age groups, which provided comparative information.

All the ads used exactly the same picture so that the imagery was a controlled variable and only the wording changed. When someone clicked through on a statement, they were presented with a screen showing 4 company logos and a “none” option, and asked which company they most associated with the statement. This was useful because it let us discover whether or not the company we were working with was competing with other companies for the same mind space.
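As a sketch of how the test grid works out, the snippet below lays out the 12 statements against the three age bands, with every variant sharing the same control image and the same follow-up question. The statement labels, age bands, and company names are placeholders – the real ads were built and run on Facebook by The Tenth Floor.

```python
# Hypothetical layout of the test grid: 12 statements x 3 age bands = 36 ads,
# all sharing one image and one follow-up association question.
from itertools import product

statements = [f"Statement {i + 1}" for i in range(12)]
age_bands = ["below target", "target", "above target"]
follow_up_options = ["Company A", "Company B", "Company C", "Company D", "None"]

ads = [
    {
        "statement": statement,
        "age_band": band,
        "image": "shared_control_image.jpg",       # same picture for every ad
        "follow_up_options": follow_up_options,    # which company do you associate with this?
    }
    for statement, band in product(statements, age_bands)
]

print(len(ads), "ad variants")   # 36
```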

Analysing the data

We used Tableau again to visualise the data. This allowed us to split it by keyword and by age group to look for comparative insights. We also plotted the company mindshare on a radar graph so we could clearly see the space each brand took up.
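For readers without Tableau, a rough Python equivalent of that mindshare radar graph looks like the sketch below; the brands, keywords, and scores are made-up placeholders.

```python
# Hypothetical radar (spider) chart of brand mindshare per keyword.
import numpy as np
import matplotlib.pyplot as plt

keywords = ["community impact", "trust", "returns", "innovation", "sustainability"]
mindshare = {   # share of respondents associating each brand with each keyword
    "ClientCo": [0.42, 0.35, 0.20, 0.30, 0.45],
    "Rival A":  [0.15, 0.30, 0.40, 0.25, 0.10],
}

angles = np.linspace(0, 2 * np.pi, len(keywords), endpoint=False).tolist()
angles += angles[:1]                                # close the polygon

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
for brand, scores in mindshare.items():
    values = scores + scores[:1]
    ax.plot(angles, values, label=brand)
    ax.fill(angles, values, alpha=0.1)

ax.set_xticks(angles[:-1])
ax.set_xticklabels(keywords)
ax.legend(loc="upper right")
plt.show()
```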

Tools:

Challenge: Agreeing on the correct wording of statements and competitors to test against.

Our results

In this case, we proved our theory was true and that the company we were working with was the one most associated with those messaging points. Of our 12 statements, 3 had a lot of engagement and 2 more had some engagement – a pattern that also followed the Pareto principle. We were also able to show how the target audience differed from the lateral audiences in their reaction to the statements.

We also discovered a big whitespace they could move into and own if they pushed the message further. We were then able to construct a marketing campaign strategy around robustly tested evidence.

********

If you would like more insights like this one, please follow me on LinkedIn.