Best practice & challenges for measurement – an interview with DataScouting


I was recently asked by Sophia Karakeva, CMO at DataScouting, to give my views on a number of topics facing the measurement industry – the original is here. I asked Sophia to provide a short introduction for this edited version:

Mike Daniels, Principal at The Measurement Practice, shares his views in this post on how we can achieve best practice for performance measurement; he explains why the combination of research, communications and technology expertise is a real challenge for measurement, and comments on the latest industry criticism of AVEs, sparked by Meltwater’s white paper promoting AVEs for PR measurement.
As a FIBEP Technical Associate Member, DataScouting, a software company specialising in media monitoring solutions for print and broadcast, believes TMP could be a good partner for FIBEP members in providing measurement training or consultancy support, and strongly believes this interview can be the starting point for a mutually beneficial cooperation.
I am especially pleased that TMP may be able to collaborate with FIBEP and its members – closer co-operation between the measurement and monitoring communities was a priority during my time as Chair of AMEC. Below is the interview – minus a chunk about The Measurement Practice, its approach and service offerings – all of which are already detailed on this website…

Q: How can we achieve best practice for performance measurement?

MD: In my view, best practice stems more from ensuring a culture of research rigour and from understanding a client’s real needs than from any one methodology being “better” than another.

There is no single one-stop definition of best practice. Rather, it resides in following certain key processes – not all of which will be necessary or relevant to every measurement client.

As a first step, it’s very important that clients understand that for any given measurement requirement, there are three critical components to balance:

[Diagram: the fast – good – cheap triangle]

This diagram reflects how organisations can benefit from any two of the three drivers, but not all three at the same time. So, you can have a high-quality provider delivering results in the shortest possible time, but it’s going to be, all other things being equal, more expensive than a service delivering either lower quality or at a slower pace. Equally, to reach a lower cost level, it is likely that either quality or speed will be sacrificed to some extent. There’s more about this in an IPR Measurement Commission paper, “International Media Analysis Made Simple”, that I co-authored.

In reality, “best practice” is thus constrained by whichever two of these three elements are deemed priorities. Additionally, “performance” can be applied either to the outcomes of a programme (impact on sales, perceptions, voting intentions or other behaviours, etc.) or to its processes (internal efficiencies, improved cost per thousand, speed of response, etc.).

However, in general, best practice in measurement needs two things: firstly, adherence to the Barcelona Principles, especially in the areas of research consistency and focus on outcomes; and secondly, commitment to certain critical operational processes.

After many years of evangelising, it’s now taken as given that measurement is only really meaningful when it’s related to desired outcomes. Reviewing and agreeing the client’s goals is therefore a prerequisite for any successful, sustainable programme, as this will identify the critical metrics to be included.

Subsequently, best practice demands a measurement framework that captures all relevant metrics at an appropriate level of granularity. The coding frame should be generated from the initial outcomes review. It’s critical to apply basic research standards, especially consistent coding and media source sampling. Be sensitive to different media channels – metrics for “traditional” media are very different from those for social media; don’t mix and match data from incompatible channels. Create reports that are relevant, visually clear and that adhere to the maxim – “less is more”. Your clients will certainly benefit from this approach!
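To make the coding frame idea concrete, here is a minimal, purely illustrative sketch in Python – every variable name, value and metric below is my own assumption, not a TMP, DataScouting or AMEC specification – showing how agreed outcome goals might translate into consistently applied codes, with traditional and social channel metrics kept apart:

```python
# Illustrative only: a toy coding frame derived from agreed outcome goals.
# All variable names and values here are hypothetical, not an industry standard.

CODING_FRAME = {
    "outcome_goals": ["raise_purchase_intent", "improve_reputation"],
    "variables": {
        "message_delivered": ["value_for_money", "sustainability", "none"],
        "tone": ["positive", "neutral", "negative"],
        "spokesperson_quoted": [True, False],
    },
}

# Keep metrics for incompatible channels separate - don't mix and match.
CHANNEL_METRICS = {
    "traditional": ["circulation", "prominence", "share_of_voice"],
    "social": ["engagements", "shares", "sentiment"],
}

def code_item(channel: str, values: dict) -> dict:
    """Apply the frame consistently: reject any code outside the agreed frame."""
    if channel not in CHANNEL_METRICS:
        raise ValueError(f"Unknown channel: {channel}")
    for variable, value in values.items():
        allowed = CODING_FRAME["variables"].get(variable)
        if allowed is None or value not in allowed:
            raise ValueError(f"{variable}={value!r} is outside the agreed coding frame")
    return {"channel": channel, "metrics": CHANNEL_METRICS[channel], **values}

# Example: one coded press article.
print(code_item("traditional", {"tone": "positive", "message_delivered": "sustainability"}))
```

The point of the sketch is simply that the frame is fixed before analysis begins and is derived from the agreed outcomes, so every analyst codes against the same, closed set of values.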

Q: Combining research, communications and technology expertise sounds like a perfect match, but does it really work in practice?

MD: I believe that it’s only by bringing these three elements together that we can get close to meeting the real challenge for measurement. In a nutshell, this challenge was posed to me many years ago by one of my biggest clients: “what I need from you, Mike, is to tell me what I don’t already know”. We need all three disciplines working in tandem to meet that challenge – to surface new insights that can help clients manage their forward activities and not just audit their past actions.

I would go further and argue that measurement is stuck in the 20th century largely because it is so far behind in bringing these disciplines together!

Traditional measurement companies, such as the one I co-founded 25 years ago (Report International, which became Salience Insight and is now Carma), are of course heavy users of computer systems to support their media capture and scraping requirements – especially as social media can only be captured digitally.

However, using technology as an analytics tool per se – the flavour of the month at the moment – has all too often been left to software engineers, who still tend to implement platforms with limited analytical insight, largely because they do not start from a research position.

And communicators, with their understandable bias towards words rather than numbers, sometimes find it tough to make sense of data, whether from research- or technology-focused suppliers. However, communicators need that insight in order to manage their changing media landscape effectively – and research companies must understand what communicators need, in order to provide real insight and not just data.

Critical to the success of my own measurement business was my experience in building a PR business in the 80s and 90s. When I moved into research, I brought my comms perspective, combined with some knowledge of research, to the development of programmes that really got under the skin of my PR clients. I could speak their language, and I created and delivered metrics that resonated with them and, more importantly, that they could immediately use for tactical purposes (programme development) and strategic purposes (planning, brand development).

At TMP, we believe that all three disciplines absolutely need to work together – that together they are far, far greater than the sum of their parts. Communicators inform the development of the coding frame; researchers ensure the coding frame can be rigorously implemented; and technologists surface new insight and create presentation platforms that dramatically increase the power and usability of the data.

Q: How would you comment on the latest industry criticism of AVEs following the “infamous” Meltwater white paper promoting AVEs for PR measurement?

MD: I have previously described AVE as a “zombie metric”. Since before my time as Chair of AMEC, the measurement industry has made, and continues to make, strenuous efforts to kill off this most maligned metric. But in a PRWeek UK survey from last year, a third of all communicators indicated that they still used AVE as a key performance metric.

The reality, of course, is that clients in the comms space continue to ask for AVEs – after all, it’s a simple number to understand (at least superficially), it has a suitably inflated dollar or euro symbol attached, and from their perspective it seems “good enough” to indicate the “value” of their work. I believe clients don’t care terribly much about the protestations of the industry – as long as no-one further up their food chain questions its value, they will continue to demand it. And the truth, of course, is that if clients demand AVE, there are very few measurement companies that would, on principle, absolutely refuse to provide it, particularly as it is a relatively easy measure to produce.
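To show just how easy the measure is to produce (and why the unexplained “multiplier” attracts so much criticism), here is a rough, hypothetical sketch of the arithmetic typically behind an AVE figure – the space, rate and multiplier values are invented for illustration and do not reflect Meltwater’s, or anyone else’s, actual method:

```python
# Rough illustration of why AVE is so cheap to produce - all inputs are invented.

def ave(space_occupied_cm2: float, ad_rate_per_cm2: float,
        multiplier: float = 1.0) -> float:
    """Advertising value equivalent: editorial space priced at the advertising rate.
    The optional 'multiplier' (sometimes claimed to reflect editorial credibility)
    has no agreed research basis - it is exactly the kind of unexplained
    assumption that undermines the metric's credibility."""
    return space_occupied_cm2 * ad_rate_per_cm2 * multiplier

# A 200 cm2 article in a title charging 45 euros per cm2, with a 3x multiplier:
print(ave(200, 45, multiplier=3))  # 27000 - a big, simple and misleading number
```

One multiplication produces a suitably impressive currency figure, which is precisely why the metric refuses to die.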

My colleague at TMP, Guy Corbet (a corporate comms expert), has written a paper arguing that, from a communicator’s perspective, in some circumstances AVE (or perhaps better, ACE – advertising cost equivalent) does capture a real performance metric, or at least is no worse at doing so than many now-fashionable measures. It will be published shortly on www.measurementpractice.com.

The Meltwater piece itself, in reality, said nothing new – Meltwater have never been ashamed of their use of AVE. Indeed, if a third of bill-paying clients still use AVEs, can you really blame them? Of course, from a research perspective, the lack of detail and the liberal use of unexplained assumptions (e.g. the multiplier figure they quote) undermined any intellectual credibility they might have been seeking. But I am sure they weren’t looking for that!

More important, I think, is the fact that the only significant reaction to the piece came from the industry itself. Clients were conspicuous by their absence from the debate. And that, I think, is proof that weaning communicators onto more meaningful, outcomes-based metrics remains a long, hard road, requiring multiple strategies – from proscribing AVEs in awards entries, to building professional measurement practice into PR university courses and professional development, through to measurement companies showing clients that there are better and more meaningful metrics to reflect and improve their communications effectiveness.

If TMP can support these efforts by improving client education, and helping FIBEP companies build more effective measurement programmes, without recourse to AVE, then we will consider ourselves on the right track!
