Roll up your sleeves, nerds. We recently sat down for a Q&A with LSB’s Director of Digital Strategy, Leanne Johnson, to discuss digital campaign measurement—including what people usually get wrong.
Q: What exactly is digital campaign measurement?
The way we do it at LSB is to ask questions before we measure anything. That’s true whether a campaign is exclusively digital or includes offline channels—and honestly, even those offline channels lead to online in most cases. Because what we’re always trying to do is solve the business problem.
So in order to get to that, we do a lot of discovery upfront to understand what success looks like from a business perspective. We’re asking the client questions, then we do a lot of work to build out a framework that has success metrics for each channel. We make sure those metrics roll up into a picture of how you measure overall success, both from a marketing perspective and for the business.
Q: What does that upfront discovery look like?
For many clients, it can be difficult to pin down the true business objective. The fact is, there are generally a lot of different stakeholders, and people often think different objectives should be the top priority simply because of their different roles. This is also where the measurement framework comes in handy: it shows how all of the different objectives can work together and where they fit in the hierarchy.
So we start that conversation in what we call our Jumpstart day, and then usually we have follow-up conversations to probe further. We ask questions, take that information from the client, and then reflect it back to make sure we’ve captured an accurate picture of their needs before we undertake digital campaign measurement. These steps are critical for alignment and an accurate view of what success looks like. Then we can take that business objective and translate it into a usable campaign objective (or objectives).
Q: So once you have your digital measurement plan mapped out, are you done? You just have to track and report from there?
Hahahahah. Not at all. LSB’s philosophy is “always on” when we have a campaign in flight.
For us what that means is we’re always reviewing campaign data and analytics and looking for trends, insights and natural optimizations. We are always looking for things that are actionable.
That’s where lower-level micro success metrics help: they show us where there are opportunities to optimize the campaign to garner the best performance and success for the client.
Q: For example?
I trot this example out quite a bit, but I think it’s telling.
A while back, we had a jewelry insurance client, and they were targeting people who were newly engaged because that was an area of opportunity for them in the past.
But we had the opportunity to run a very small test and look at the post-wedding consumer with one specific email marketing campaign. Using the measurement plan we had outlined, with business goals and success metrics already defined per channel, we were able to compare the email campaign data against the engaged/pre-wedding consumer. We saw after four to five instances that conversions were better for the post-wedding consumer—and the difference was statistically significant.
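As a rough illustration of the kind of check involved (the interview doesn’t say which statistical test LSB used), a two-proportion z-test is one common way to tell whether a conversion-rate difference between two audiences is statistically significant rather than noise. All numbers below are hypothetical:

```python
from statistics import NormalDist

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided z-test comparing conversion rates of two segments.

    conv_a/n_a: conversions and recipients for segment A (e.g. pre-wedding)
    conv_b/n_b: conversions and recipients for segment B (e.g. post-wedding)
    Returns (z, p_value); a small p-value suggests a real difference.
    """
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that both segments convert equally
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Hypothetical email results, for illustration only
z, p = two_proportion_z_test(conv_a=120, n_a=4000, conv_b=180, n_b=4000)
print(f"z = {z:.2f}, p = {p:.4f}")  # significant at the usual threshold if p < 0.05
```

The same comparison works for any per-channel success metric expressed as a rate, which is one reason defining those metrics upfront makes in-flight testing straightforward.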
After that, we built out a deeper test. We tested the new target in social to vet it further, and eventually it became a new and very successful target of focus for them, so we recommended placing more dollars there.
It was kind of an insight hiding in plain sight. Like, of course, the engaged-to-be-married target is consumed with their wedding, but post honeymoon, they’re ready to get their act together, do their paperwork, name changes, the whole package—including insurance for their new jewelry. But all of that became apparent because of in-flight digital campaign measurement.
Q: How do people get digital campaign measurement wrong?
A few different ways.
They don’t have that initial alignment on what success looks like, so sometimes you can have a big disconnect between what the agency is reporting back to the client and what the client’s expectations are. So the client sees that reporting and they’re saying, “that’s not what success looks like for us.” We’ve even experienced it here.
Also, because digital still sometimes gets siloed, the reporting on the different channels is siloed, too. There should be more cross-channel testing. We’ve seen some nice insights come out of using search and social at the same time with similar messages and similar focus, and we’ve been able to take those learnings and blow them out to channels with longer lead times.
I’ve also seen other agencies do a nice job of defining what success looks like by channel, but they don’t take it a step higher to set the larger objective in place, or a step lower by anchoring those KPIs to meaningful user segments. You want to know what the true campaign and business objectives were, and you need to ladder all those things up. A channel can be successful by itself as defined by channel metrics, but if those metrics don’t tie to the campaign objective and the business results, then you have to question whether it was really successful.