Marketing mix modelling and the future of marketing measurement

Today’s digital landscape presents marketers with a variety of challenges. How can we navigate a future of cookieless measurement and attribution? How can we keep up with continuous browser changes?

Doing nothing is not an option. We need measurement and optimisation solutions that ensure the continued effectiveness of digital media, without compromising privacy.

The answer: marketing mix modelling (MMM). 

While MMM has been around for some time, a modern approach that honours user privacy is key to preserving and improving the effectiveness of your marketing.

The three eras of marketing effectiveness

To understand the future of measurement and attribution in a cookieless and privacy-centric digital world, we first need to look at how marketing effectiveness has evolved. We can split this up into three broad eras.

Pre-internet: self-declared attribution and “traditional” marketing mix modelling

In the early days of advertising, marketers could collect a tally of where people had encountered their ads. They could use this to make informed decisions about where to place more ads and which publications to partner with.

Skip ahead to the 1970s, and marketers started asking how they could build a statistical model to predict future sales. This approach is known as “marketing mix modelling”: it uses data such as historical marketing spend and seasonality to predict how the market will react to future investment.

Back in the 70s, media would typically be bought six to 12 months in advance. The market arguably moved slower than it does today and across fewer channels. So it wasn’t unreasonable to expect that data from years ago would still be valuable for predicting the performance of next year’s marketing.
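
As a simple illustration of the traditional approach, the sketch below fits a basic regression of weekly sales against channel spend and a holiday flag. The channel names and figures are made up for the example; a real model would use years of a brand’s own data.

```python
# A minimal sketch of "traditional" marketing mix modelling: predict sales
# from historical spend and a seasonality signal. All figures are invented.
import pandas as pd
from sklearn.linear_model import LinearRegression

# Hypothetical weekly observations: spend per channel plus a holiday flag.
data = pd.DataFrame({
    "tv_spend":    [120, 80, 150, 60, 200, 90],
    "print_spend": [40, 55, 30, 70, 25, 60],
    "is_holiday":  [0, 0, 1, 0, 1, 0],
    "sales":       [900, 760, 1150, 700, 1400, 820],
})

X = data[["tv_spend", "print_spend", "is_holiday"]]
y = data["sales"]

model = LinearRegression().fit(X, y)

# The coefficients approximate how much each unit of spend (or the holiday
# effect) contributes to sales, which informs future budget planning.
print(dict(zip(X.columns, model.coef_.round(2))))

# Predict sales for a planned week of spend.
planned = pd.DataFrame({"tv_spend": [130], "print_spend": [45], "is_holiday": [0]})
print("Predicted sales:", round(model.predict(planned)[0]))
```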

Birth of the internet: multi-touch attribution

After the internet came along in the 1990s, marketers no longer had to build a model to predict where their sales were coming from. Instead, they could observe it in real time and measure every customer interaction.

Identity-based attribution models emerged, in which customer journeys are measured at the individual level. The simplest models would apply predefined rules to customer journeys to determine how credit should be awarded to each touchpoint in the journey: for example, the last click receives 100% of the credit, or every touchpoint receives an equal share. But it wasn’t long until marketers realised that this approach is hugely subjective and significantly oversimplifies the complex reality of how customers behave online. In response, data-driven attribution techniques were developed in an attempt to remove the subjectivity from digital attribution modelling.
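
To make the rules-based approach concrete, here’s a minimal sketch that applies a last-click rule and an equal-credit rule to a hypothetical customer journey; the channel names are purely illustrative.

```python
# A minimal sketch of rule-based multi-touch attribution applied to one
# hypothetical customer journey. Channel names are illustrative.
from collections import defaultdict

journey = ["paid_social", "organic_search", "email", "paid_search"]

def last_click(journey):
    """Award 100% of the credit to the final touchpoint."""
    return {journey[-1]: 1.0}

def equal_credit(journey):
    """Award every touchpoint an equal share of the credit."""
    share = 1.0 / len(journey)
    credit = defaultdict(float)
    for touchpoint in journey:
        credit[touchpoint] += share
    return dict(credit)

print(last_click(journey))    # {'paid_search': 1.0}
print(equal_credit(journey))  # every channel receives 0.25
```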

However, after decades of using these multi-touch attribution models, we’ve become so accustomed to them that we sometimes forget their limitations. Multi-touch attribution:

  1. Doesn’t measure the incremental value of your marketing.
  2. Assumes digital touchpoints are the only things that influence digital performance.
  3. Can only ever be as good as the quality of your data.

Today: user privacy, browser restrictions, and new techniques

The general direction of travel globally is towards more legislation to protect the privacy of consumers. As marketers, we must collect user consent to analyse and process their data. In addition, web browsers are implementing more restrictions that go far beyond just the deprecation of third-party cookies. This means that our ability to measure complex user journeys online is being eroded, which in turn diminishes the accuracy and value of multi-touch attribution models.

However, new technologies have been released to help ensure we can still optimise the effectiveness of digital campaigns. These include:

  • Custom audiences – uses hashed PII from consenting users to create audiences and lookalikes that can be targeted in media platforms, without the use of pixels (see the hashing sketch after this list).
  • Server-side measurement – data is sent to ad platforms from a server-side system, rather than directly from a user’s browser via a tracking pixel. This can help mitigate the effect of ad blockers used by consenting users.
  • Advanced matching – uses hashed PII to better attribute conversions from consenting users to the campaigns that helped drive them, even when the user’s cookies have been cleared.
  • Conversion modelling – a process in Google Ads and GA4 in which Google aims to estimate the source of conversions from your non-consenting users by training a machine learning model with the data from your consenting users.
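
As a rough illustration of the hashed-PII approach behind custom audiences and advanced matching, the sketch below normalises and SHA-256-hashes an email address before it would ever leave your systems. The exact normalisation rules vary by platform, so treat this as an assumption to verify against the relevant documentation, and only process data from users who have consented.

```python
# A minimal sketch of preparing PII for custom audiences or advanced matching.
# Most ad platforms expect values to be normalised and hashed with SHA-256,
# but check the specific platform's requirements before relying on this.
import hashlib

def hash_email(email: str) -> str:
    """Normalise (trim, lower-case) and SHA-256 hash an email address."""
    normalised = email.strip().lower()
    return hashlib.sha256(normalised.encode("utf-8")).hexdigest()

# Only hash and share data from users who have given consent.
consented_emails = ["Jane.Doe@example.com ", "sam@example.co.uk"]
hashed = [hash_email(e) for e in consented_emails]
print(hashed[0])  # a 64-character hexadecimal digest, not the raw address
```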

Implementing these techniques is important to preserve the effectiveness and measurability of your marketing.

Moving beyond digital attribution

Browser changes, user privacy choices, and the dwindling efficacy of cookies mean it’s becoming increasingly difficult to accurately track user journeys through to conversion. As marketers, we need a new solution.

To truly move beyond multi-touch attribution, we’ve got to switch our collective mindset. We often think about conversions from the bottom up. In other words, we look at a conversion and ask “where did it come from?”, working backwards to try and identify all the touchpoints that occurred in the days prior to the conversion. However, this approach requires identity-based attribution.

Instead, we should approach the problem from the opposite direction and ask: “What are the factors that contribute to my business’s total performance?”.

This is where modern MMM solutions like Meta’s open-source framework, Robyn, come in. Much like data-driven attribution, modern MMM solutions aim to reduce subjectivity and improve utility compared with more traditional approaches by:

  • Automating data preparation and incremental model updates
  • Favouring recent data when modelling marketing factors, while retaining long-term data for seasonality
  • Using modern processing power to test hundreds of thousands of models to find the ones that objectively perform best (sketched in simplified form after this list)
  • Making use of more granular channel breakdowns
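
To give a flavour of what happens under the hood, here’s a deliberately simplified Python sketch (Robyn itself is an R package, so this is not its API) that applies illustrative adstock and saturation transforms to a synthetic spend series and grid-searches a handful of hyperparameter candidates. Real frameworks evaluate vastly more candidates across many more variables.

```python
# A simplified illustration of the core transformations in a modern MMM
# framework. The adstock and saturation functions, hyperparameter values,
# and synthetic data below are assumptions for demonstration only.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(42)
weeks = 104
spend = rng.uniform(50, 200, size=weeks)                      # weekly channel spend
seasonality = 10 * np.sin(2 * np.pi * np.arange(weeks) / 52)  # yearly cycle
sales = 500 + 2.5 * spend + seasonality + rng.normal(0, 20, size=weeks)

def adstock(x, decay):
    """Carry a share of each week's media effect over into later weeks."""
    out = np.zeros_like(x)
    for t in range(len(x)):
        out[t] = x[t] + (decay * out[t - 1] if t > 0 else 0.0)
    return out

def saturation(x, half_saturation):
    """Diminishing returns: extra spend adds progressively less effect."""
    return x / (x + half_saturation)

# Grid-search a handful of candidate hyperparameters; production frameworks
# automate this across far larger search spaces and many channels.
best = None
for decay in (0.1, 0.3, 0.5, 0.7):
    for half_sat in (50, 100, 150):
        X = np.column_stack([
            saturation(adstock(spend, decay), half_sat),
            seasonality,
        ])
        model = Ridge(alpha=1.0).fit(X, sales)
        score = model.score(X, sales)
        if best is None or score > best[0]:
            best = (score, decay, half_sat)

print(f"Best in-sample R² = {best[0]:.3f} at decay={best[1]}, half-saturation={best[2]}")
```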

RocketMill’s approach to MMM

Robyn is our preferred framework for developing MMM. It gives us the tools to be transparent in how we measure the effectiveness of our clients’ marketing. It also gives our clients the means to reproduce the models we build, completely and identically, which removes any perception that there could be bias in the modelling.

But Robyn itself is just one small part of the puzzle. It takes time to gather, clean and prepare your data before you can train a model. Once you’ve built the model, you also need to deploy it to your end users in a way they can understand. Finally, you need a system to refresh and update your model periodically.

This is where we can help. We’ve created data pipelines, data modelling processes and other technologies that allow us to build, train, deploy, and refresh models faster than ever. We’ve not lost sight of the end user either – we’ve created a dashboard that clearly displays the size of the opportunity and how to achieve it, in a way that’s quick and easy to understand for marketers who may only have experience of multi-touch attribution solutions.

To read more about how RocketMill combines cookieless and identity-based solutions to help optimise your media, download our connected measurement eBook. You can also watch Rhys’ talk at our latest event with Meta below:

If you’d like a demo of our Marketing Mix Modelling solution, please contact us.