When is perfect not good enough?
As marketers we strive to select the best audience, timing, and channel to get the best results. Sometimes, though, we can focus too much on getting everything perfect. So why do we bother?
Why not instead do things which are nearly there, and learn by doing rather than by over-analysing?
This is not about doing less or being slapdash; it is about recognising when you need to go the extra step to make a campaign profitable.
We are all aware of the 80/20 rule, whereby we try to get 80% of the value from 20% of our audience. However, with more and more data, and more ways of accessing and analysing it, paralysis by analysis seems to have set in and we are unable to act quickly.
In fact, why not apply the 80/20 rule again and get 80% of the way there in 20% of the time? The result might not be the perfect analytical model for selecting an audience, but at least we would have created a campaign which is still targeted and which has gone out quickly to our customers. We can then learn from what worked and what didn't.
Isn’t everyone doing this today?
The following stories might be very one-sided, but I wanted to illustrate what I have experienced:
Analytical throttling: I met a large telecoms company in Canada. We discussed productivity and how we could improve the efficiency of the data team, as the feedback from the marketing department was that things were taking too long to turn around. The head of data said there was no need to improve efficiency. He was in a position of power and had a team of 40 people. If there was a bottleneck he would simply employ more people, giving him a bigger team and the ability to command a higher salary.
Analytical costs: Working with a UK media company, we were trying to plan smaller tactical campaigns but were unable to execute them because of the high cost of the data team. We wanted to test some very simple creative messages, yet once we had briefed the data team there was little or no budget left for the creative, making the whole exercise pointless.
These might be one-offs, but surely there is a way in which we can use analysis and data in a more lightweight way?
But won't doing less analysis mean the campaign delivers a negative return?
That might be correct in certain circumstances but not in all cases. Take the following scenario:
I have a campaign targeting 100,000 customers. I have a budget for data analysis but want to get the best result for my investment; not an uncommon theme.
Looking at the 80/20 rule, this is what happens:
Now to validate if it is worthwhile spending 80% more time getting the final 20% of value we get the following outcome:
In this example the numbers were relatively low, as was the average order value, but the extra analysis would have been worthwhile had these numbers been different.
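The trade-off described above can be expressed as a simple break-even check: does the incremental revenue from the final 20% of value cover the cost of the extra 80% of analysis time? The sketch below uses entirely hypothetical figures (the costs, responder counts, and order value are illustrative assumptions, not numbers from the scenario):

```python
# Hypothetical break-even check: is the extra 80% of analysis effort
# worth the final 20% of value? All figures are illustrative assumptions.

def extra_analysis_pays_off(extra_cost, extra_responders, avg_order_value):
    """Return True if the incremental revenue from the deeper analysis
    covers its incremental cost."""
    incremental_revenue = extra_responders * avg_order_value
    return incremental_revenue >= extra_cost

# Suppose the deeper model costs an extra 8,000 and is expected to win
# roughly 150 additional responders at a 40 average order value:
# 150 * 40 = 6,000, which does not cover the 8,000 cost.
print(extra_analysis_pays_off(8000, 150, 40))   # False: stop at 80%

# With a higher order value the picture flips: 150 * 60 = 9,000.
print(extra_analysis_pays_off(8000, 150, 60))   # True: the extra work pays
```

The same check works in reverse for campaign planning: given the expected uplift, it tells you the maximum analysis budget worth spending before the final 20% of value stops paying for itself.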
This is meant to be illustrative, but it does highlight that getting nearly there shouldn’t be overlooked.
So, before starting with a big analysis project, see if it is possible to execute the campaign more quickly, considering:
- Whether it can be done with 80% of the analysis
- Whether the costs are worth the model uplift or benefit
- Whether speed to market is more important than overall accuracy