Dashboards: A conversation with Nick Drew of Fuse Insights and Sean Dixon of Omnicom Media Group
March 27th, 2019 | ACA Team, Association of Canadian Advertisers
- What’s the biggest challenge facing the marketer as they develop a dashboard?
- Give the listener one “must do” or one “don’t do”
- Do people have a pretty good understanding of where the numbers come from and their definitions?
- Is attribution modelling too short term focused?
- Is market mix modelling too cumbersome?
- Why do dashboards fail?
- The focus on short-term metrics has been blamed for brand destruction; are real-time dashboards to blame?
Nick Drew: It’s being drowned in data. As we said in the whitepaper, there is so much data available that could conceivably be relevant, and you can keep adding and adding, in an “oh, just this one more piece, because we might need it” way, until you end up with something utterly unworkable that doesn’t do what’s needed.
The other danger is related: being distracted by the data (or the visualization). Big numbers often feel like the goal – “this number is bigger than that one, therefore it’s better”. But the goal of the dashboard is to enable actionable insight, and that might come from a mistake.
Sean Dixon: I agree, and think that it underscores the need for a plan. A dashboard should be a mechanism to evaluate your strategy. If you’ve done the necessary step of identifying your top-level objective, every decision to include a metric should flow from the top down. Without a strategy, there are just too many metrics to choose from, and you can lose sight of meaningful measurement.
Nick Drew: You must start with a clear vision of why you’re doing this – and not just a “because we want to be smarter”, but a clear objective, that “this dashboard will achieve ____; by showing us ____”. That’s your north star, essentially, as you build the dashboard, and as you consider which metrics to include, what to do with them etc. And you’ll probably find you come back to it over time; it’s a good way to assess whether your dashboard is good enough, by questioning whether it’s hitting that objective.
And you mustn’t start by imagining that the data will show you the right dashboard to build – you have to bludgeon the data into submission!
Sean Dixon: I guess we won’t have too much in the way of opposing viewpoints. I might add (or repeat): design before you build, not after.
Nick Drew: I think my answer would be “it depends”, but it’s also important to understand how much you need to know. Do you need to know the exact panel size comScore has? Probably not. But do you need to know the potential biases, and how to compare impression counts from Facebook with MOAT figures? Absolutely.
There are two main points where this is a vital question. The first is looking at the data you need versus the data you actually have, and trying to fit the latter into your vision. At that point, you really need to understand what the number means so that something isn’t lost in translation.
The second is when comparing two sources for what seems like the same proof point, for instance comparing impressions from a platform versus impressions from 3rd-party ad verification for the same campaign. Whether a number is “right” or “wrong” is somewhat moot at this point – what’s more important is whether it’s right or wrong for what you need. Sean made a really good point that a platform’s numbers can be 100% correct but still be pointless for you – they simply show how many impressions you paid for, not how many people saw them or how many people interacted with them. Again, knowing what the numbers show, and what they mean, is vital here.
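To make that point concrete, here’s a minimal sketch with invented numbers (not from this discussion): a platform’s impression count and a 3rd-party verification figure for the same campaign can both be accurate, yet answer different questions.

```python
# Hypothetical figures for one campaign, measured two ways.
platform_impressions = 1_000_000   # impressions the platform billed for
verified_viewable = 620_000        # impressions a 3rd-party tool deemed viewable

# Both numbers can be "correct" -- they simply measure different things:
# what you paid for, versus what people plausibly saw.
viewable_rate = verified_viewable / platform_impressions
unverified = platform_impressions - verified_viewable

print(f"Viewable rate: {viewable_rate:.0%}")        # 62%
print(f"Unverified impressions: {unverified:,}")    # 380,000
```

Neither source is “wrong”; which one belongs on the dashboard depends on whether the objective is tracking spend delivery or audience exposure.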
Sean Dixon: Attribution modeling always includes some amount of assumptions, whether that’s in the nature of the data, the tactic or thing being measured, or even the model fitting. As long as those assumptions are considered, adding context to model outputs, the time frame for attribution modeling should be appropriate to those limitations. For an overall business view, yes, I think the focus on MTA, and on immediate or near-term decisions around MMM and MTA, does tend towards being too short-term.
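The assumptions Sean describes are easy to see in a toy example. The sketch below (hypothetical conversion paths, invented for illustration) compares two common attribution rules – last-touch versus linear credit – and shows how the choice of rule, not the data, determines which channels look valuable.

```python
from collections import defaultdict

def last_touch(paths):
    """Credit each conversion entirely to the final touchpoint."""
    credit = defaultdict(float)
    for path in paths:
        credit[path[-1]] += 1.0
    return dict(credit)

def linear(paths):
    """Split each conversion's credit evenly across all touchpoints."""
    credit = defaultdict(float)
    for path in paths:
        share = 1.0 / len(path)
        for channel in path:
            credit[channel] += share
    return dict(credit)

# Hypothetical converting paths: ordered touchpoints before each conversion.
paths = [
    ["display", "social", "search"],
    ["social", "search"],
    ["display", "search"],
]

print(last_touch(paths))  # search gets all 3 conversions
print(linear(paths))      # display and social get credit last-touch hides
```

Same data, two very different budget stories – which is why the assumptions need to travel with the outputs as context.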
Nick Drew: It is very cumbersome – particularly in the gathering of data over the required period. It’s limited in what it achieves: it isn’t a silver bullet that magically tells you what mix to use, and it’s totally built on what’s happened before (it’s not a guarantee of what will happen in the future). And it’s expensive. But it also provides a level of insight and guidance at a macro level that can’t really be achieved with current tech through other means.
Sean Dixon: Totally agree again. There’s also the issue of how reductive the modeling has to be. It can scale up through nested models, sure, but so much of the strategy and nuance of the roles of media, different objectives, etc. has to be boiled down to dollars or impressions, and when MMM is modeled against outputs that weren’t campaign objectives, there’s real risk in using MMM as gospel for how your budget should be allocated. As Nick mentioned, it’s always looking back, and external factors can really disrupt the findings when applied moving forward. There’s tremendous value in MMM – it can help justify spends and, depending on the scale and complexity, demonstrate the impact certain variables have on business goals – but it’s an ingredient in effective advertising, not the whole recipe.
Nick Drew: Scope creep and/or imperfect scope definition. When you have a clearly defined framework of “this dashboard will … by showing …”, it’s relatively straightforward to build and maintain to those specs – even making the necessary compromises. A vague start, or trying to add elements along the way, while not a surefire recipe for disaster, reduces the efficacy of the dashboard, and its relevance.
Sean Dixon: Dashboards are a platform for insight. Presenting information that doesn’t provide for insight and action makes them irrelevant and frustrating. Action comes from making them part of the workday: as status controls, holding responsibility, uncovering mistakes to reflect upon, and celebrating team milestones. One big question is who gets to see the dashboard?
Nick Drew: Marketing theory is important and, as Byron Sharp puts it, balancing the right funding ratio between branding- and response-led campaigns should not be overlooked. It’s vital context as you map out brand objectives, what you need from a campaign, and over how long a timescale you should be judging success. That influences what’s on the brand dashboard and what’s on the response dashboard.