Measuring impact: are you churning out raw data or actionable insights?
We still seem to practice the same basic "churning out of raw data" that we have done since the online tools and platforms we now use so extensively started spitting out performance data a couple of decades ago. And it still tends to be a reactive afterthought: measuring the success of something for which no clear, tangible success criteria were defined before it was released into the world. When that is the case, there is little to pin those measurements onto other than the "sheer mechanics" - e.g. the number of times a piece of information was accessed - but nothing about what those who accessed it did with it, or whether it led to any behavioural change. So really they are indicators that tell us nothing about whether the given activity has had any meaningful or desired impact on the organisation. Still today, when talking to members about what the business actually uses the reported data for, you will get responses along these lines:
“… lots of numbers, but no resources to crunch, analyse and act on the data.”
or
“… the numbers digital analytics provide are at best confusing and at worst irrelevant to the business.”
Why is it so hard to make the change and move beyond simply reporting the raw data generated by the tools we use, and instead tie these numbers and the way they develop to actual business objectives - to activities that tell us something about the success or otherwise of the business we work for?
Having observed this field and how professionals working in communications, HR, IT and other functions operate for many years, my conclusion is that there are a couple of reasons why this is not happening. Firstly, many are simply stuck in a way of working that has long since become the established norm. Years of churning out the same type of unprocessed data, perhaps presented through some graphs, diagrams and other visualisations, have simply created an expectation from the rest of the business - or at least that is how it is perceived - of "this is how we do it".
Secondly, the seemingly simple task of identifying the right, meaningful metrics is both time-consuming and requires a highly flexible and adaptive approach, as it turns out that a "one size fits all" model does not work. The type of activities you have to observe to decide whether what you are monitoring is moving in the desired direction can vary considerably from one initiative to the next, and the "instruments" and "reporting methods" will consequently also have to differ. While it is fairly straightforward to decide on some principles and guidelines for this type of work, it is almost impossible for those doing the reporting today to find the time to identify the right metrics, measuring tools and so on for every activity or initiative they are asked to monitor without extra resources and manpower being added.
Moreover, it requires a different way of thinking about measurement than the one typically deployed by our members today; it requires consultation with the various business areas and stakeholders to find out what their precise objectives and success criteria are and a conversation about whether a correlation can be made between those objectives and the activities we are measuring.
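To make that consultation concrete, here is a minimal, purely illustrative sketch (in Python) of what its outcome might look like: each stakeholder objective is paired with an explicit success criterion and the metric that will be used to judge it. All business areas, criteria and metric names below are hypothetical, invented for the sake of the example.

```python
# Hypothetical outcome of a stakeholder consultation: every objective is
# tied to an explicit success criterion and the metric used to track it.
objectives = [
    {
        "stakeholder": "HR",                       # hypothetical business area
        "objective": "Faster onboarding of new hires",
        "success_criterion": "Average time-to-productivity under 30 days",
        "metric": "days_to_first_completed_task",  # what we actually measure
        "data_source": "intranet onboarding pages + HR system",
    },
    {
        "stakeholder": "Internal Communications",
        "objective": "Staff act on safety notices",
        "success_criterion": "90% of affected staff confirm within 48 hours",
        "metric": "confirmation_rate_48h",
        "data_source": "notice read-and-confirm tool",
    },
]

# Reporting then starts from the success criterion, not from the raw tool output.
for item in objectives:
    print(f"{item['stakeholder']}: report '{item['metric']}' "
          f"against '{item['success_criterion']}'")
```

The point of a structure like this is simply that the raw numbers only get reported once they have a criterion to be measured against.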
The different perspectives of the captain and the engineer: distinguishing between operational and business analytics
To illustrate, here is an example from a member that has always stuck with me: one of how to distinguish between operational analytics, which is important to the specialists or subject matter experts, and business analytics, which is important and interesting to the business at large.
This particular member worked at a company that builds ship engines; some of the best in the world. These engines are so sophisticated that they are able to predict with great accuracy when key components are likely to fail and need replacing. Moreover, the engine can transmit this prediction data via satellite to headquarters, where the relevant engineer can get the spare part ready and be underway: flying out to whichever destination in the world and being ready at the terminal when the ship arrives.
The engineer can embark and change the part immediately while the ship is offloading and loading, allowing the ship to depart without delay to its original schedule. Commercial ships are always on a tight schedule, and every minute lost can cost the shipping company a lot of money, so being able to predict the type of scenarios that require specialist attention - and have the specialist in place when the ship next docks - is hugely valuable.
Why am I telling you this story? Because it contains the two types of analytics referred to earlier: operational and business analytics. There is the data indicating that a specialist is required; in this case the performance and prediction data transmitted by the engine to headquarters. This is “operational analytics”: data that is both interesting and useful for the specialists - those working in the “engine room”, who rely on exactly this type of data to make decisions and do their job - but which rarely makes sense to anybody else. This is not exclusive to people working with ship engines or digital tools; most professions have their own specialised data sets that are of little interest, and make no sense, to people not schooled in that profession.
What is interesting to the business at large?
The other type of analytics contained in this story is that reflecting the impact the engineer has through carrying out his work. The most important impact is the amount of money he is able to save the company by getting the ship safely off after service without any delay. This is more strategic in nature and is exactly the type of “business analytics” referred to earlier. In other words: what is interesting to everyone else in the business is the result of the engineer’s effort as it translates into a saving on the bottom line. The method he used and the operational data he relied on to successfully get there are of little or no interest to anyone outside his own department - and require specialist knowledge to understand - but when translated into ‘impact on the bottom line’, the value of his work suddenly becomes much clearer to everyone.
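To illustrate that translation, here is a minimal back-of-the-envelope sketch in Python: the operational facts (a predicted failure, a part replaced in port) only become a business figure once they are expressed as delay avoided multiplied by the cost of delay. All figures are invented for the sake of the example and do not come from the member’s company.

```python
# Hypothetical illustration: turning operational facts into a business figure.
# All numbers below are invented for the sake of the example.

# Operational analytics: meaningful to the engine-room specialists.
predicted_failure = True          # engine flagged a component for replacement
repair_done_in_port = True        # engineer replaced it while the ship was loading

# Business analytics: what the rest of the organisation cares about.
delay_avoided_hours = 18          # assumed delay if the ship had waited for the repair
cost_per_hour_of_delay = 4_000    # assumed cost to the shipping line, in EUR

if predicted_failure and repair_done_in_port:
    estimated_saving = delay_avoided_hours * cost_per_hour_of_delay
    print(f"Estimated saving from avoided delay: EUR {estimated_saving:,}")
    # -> Estimated saving from avoided delay: EUR 72,000
```

The engine’s prediction data stays in the engine room; the single figure at the end is what travels to the rest of the business.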
The challenge for most of us working with digital tools, communications and information flows generally is that we are reporting our ‘operational analytics’ to the business around us: performance data provided by our tools and engines, which can guide us and ease our decision-making, but which makes little sense to outsiders and leads to the type of quotes at the top of the article; confusing at best…
What are you reporting?
Are you reporting operational or business analytics? Do you know the success criteria and objectives of the key decision makers in your organisation - and have you made the link between those and the work you are doing? Are you reporting anything at all?
Join the discussion
Over the coming months, we will be exploring ways of developing good, solid and meaningful frameworks for measuring impact. Frameworks that are workable, understandable and valuable both to those reporting and to the business at large. Look out for events and activities focusing on this. One I’d like to highlight is the launch of the EXPLORER group “Finding new ways of measuring impact”, which launches on 5 January 2023.
ConnectMinds EXPLORER
The EXPLORER format allows participants to dive deep into the details of the topic in question, look at it from any number of angles, test out ideas and thus build a much more robust knowledge base; perhaps even collaborate to create something tangible together that can be used back in the participants’ respective organisations. For more info about EXPLORER groups and the various topics (including Employee Advocacy), planned launch dates, how to sign up and how to pitch your own topic idea for an EXPLORER group, visit the EXPLORER page on our site.