Want to Judge Enterprise Innovation? Measure Earnings per Byte First

Why the Way You're Measuring Data Value Misjudges Your ROI

If you plan to remain a viable, long-term participant in the marketplace, your organization needs to put its data to work in new and increasingly profitable ways. Businesses that derive ongoing improvements and insights from their data are better at spotting emerging opportunities and staying ahead of the curve. The logical question is: how does an organization measure its own ability to use predictive insights in effective, valuable, and creative ways?

Just because your pool of data is fathoms deep does not mean its potential is being realized. Unless organizations measure the value their data actually delivers, opportunities to improve revenue streams, increase operational efficiency, and move into new and profitable products and markets will continue to go unnoticed.

What Needs to Be Measured for Your Full ROI

As with any data strategy, organizations need to begin with a collective, enterprise-wide view. Data governance needs to precede any attempt to gather complete and accurate data quality metrics. Disjointed analysis produces only localized insight, perhaps addressing a specific department's data quality management but not the enterprise as a whole. Once that data vision expands, organizations will be better positioned to analyze how their data-driven innovation contributes to financial growth.

Certain performance measures come close to assessing data impact but miss the objective by focusing on the end result without investigating the value of the source material. Return on Innovation Investment (ROII), which measures the effectiveness of enterprise innovation, is one such metric. An approach that might move this metric closer to our goal is to take Harvard Business Review's recommendation and combine it with DuPont's return on equity breakdown (stay with me), subdividing the measurement into factors better suited to reviewing the quality of data assets:

  • Innovation Magnitude (financial contribution divided by successful ideas)
  • Innovation Success Rate (successful ideas divided by total ideas explored)
  • Investment Efficiency (ideas explored divided by total capital and operational investment)
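The appeal of this decomposition is the same as DuPont's: the three factors chain together, so multiplied out they reduce to financial contribution over total investment. A minimal sketch, using entirely hypothetical figures (only the structure of the breakdown comes from the recommendation above):

```python
# DuPont-style ROII decomposition -- all figures are illustrative assumptions.
financial_contribution = 2_400_000  # net financial gain from successful ideas ($)
successful_ideas = 8
ideas_explored = 40
total_investment = 1_600_000        # total capital + operational spend ($)

innovation_magnitude = financial_contribution / successful_ideas   # $ per successful idea
success_rate = successful_ideas / ideas_explored                   # hit rate of explored ideas
investment_efficiency = ideas_explored / total_investment          # ideas explored per $

# The intermediate terms cancel, so the product is simply
# financial contribution / total investment -- the headline ROII.
roii = innovation_magnitude * success_rate * investment_efficiency
print(f"ROII = {roii:.2f}")  # ROII = 1.50
```

The value of the split is diagnostic: a weak ROII can now be traced to small wins, a low hit rate, or too few ideas per dollar, rather than reported as a single opaque number.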

Though it looks as if this is closer to the type of performance metrics needed, the measure still falls short. This approach is like recognizing the value of using steel as a building material without measuring the strength and quality of the metals going into the alloy.

Digging deeper to evaluate both enterprise and analytics department efficiencies would require a measurement that compares how levels of data-driven innovation correlate with financial returns. Earnings per Byte, a measurement of enterprise earnings against data assets (at the tera-, peta-, or larger scale), would give both senior management and investors insight into how data translates into meaningful value. If industry adopted such a metric on a comprehensive scale, it would even prove valuable as organizations work to attract new investors. Not to mention, senior executives would take comfort in knowing that their IT or data departments' investments and analytic innovations are affecting the organization in an irrefutable way. But before that metric can be created and leveraged, there needs to be a common definition of data quality metrics.
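At its simplest, the metric is just earnings divided by the volume of data assets held. A rough sketch, where the figures and the terabyte granularity are illustrative assumptions rather than a standard definition:

```python
# Minimal Earnings per Byte sketch -- figures are hypothetical.
annual_earnings = 50_000_000   # enterprise earnings ($)
data_assets_tb = 400           # total managed data assets (terabytes)

epb_per_tb = annual_earnings / data_assets_tb
print(f"EPB = ${epb_per_tb:,.0f} per terabyte")  # EPB = $125,000 per terabyte
```

The arithmetic is trivial; the hard part, as the rest of this piece argues, is agreeing on what counts in the numerator and how the denominator is categorized.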

Quantifying the Value of Data Assets

Given the amassing of data assets – internally gathered or externally purchased, structured and unstructured, thoroughly analyzed or new potential data streams – a fundamental question exists: is the financial and operational investment to support these assets worth it? To properly assess this in combination with an Earnings per Byte measurement, categorization of data sets would prove valuable.

  • Traditional Operational data – This is the data that organizations already understand how to gain insight from, report on, and utilize operationally. Examples include customer demographics, production costs and profit margins, product inventory, and supply chain KPIs.
  • Exploratory data – These data sets are not used on a regular basis and quite often remain latent and untapped, despite their potential for generating new revenue and reducing costs. Examples may include purchased data sets, connected home/IoT data, member biometrics, drug efficacy sensor data, etc.

Once categorized, both operational and exploratory data would need to be evaluated differently. Consider:

  • Operational EPB – Calculations for traditional earnings per byte would signify the organization’s ability to utilize traditional data elements or features, further capitalizing on or maintaining steady-state financial gains.
  • Exploratory EPB – Calculations for exploratory earnings per byte would act as a barometer of the organization’s forward progress in usage, potential opportunity, and market leadership in terms of creativity. It could identify new investment opportunities for deeper insight or notify decision makers when it’s time to update archival policies or make overall assessments of data management.
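Splitting the metric by category can be sketched as follows. All figures, and especially the attribution of earnings to each category, are hypothetical; deciding which earnings a given data set actually produced is the hard part in practice:

```python
# Hedged sketch of category-level EPB, per the operational/exploratory split above.
# Figures and earnings attribution are illustrative assumptions only.
earnings = {"operational": 45_000_000, "exploratory": 5_000_000}  # $ attributed per category
assets_tb = {"operational": 120, "exploratory": 280}              # terabytes held per category

epb = {category: earnings[category] / assets_tb[category] for category in earnings}

# Operational EPB gauges steady-state return on well-understood data;
# exploratory EPB is an early signal of value in latent data -- a persistently
# low figure may argue for deeper analysis or for revisiting archival policy.
for category, value in sorted(epb.items()):
    print(f"{category} EPB: ${value:,.0f}/TB")
```

Note how the exploratory figure is naturally far lower per terabyte: the point of tracking it separately is the trend over time, not comparison against the operational number.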

The Common Thread for Effectively Measuring Data Value

The complexities involved in defining and monitoring data to evaluate its monetary value are great, and there is one prominent obstacle to this type of evaluation: the viral way in which the size and scope of data grow. Consider how a terabyte of data used to be significant and is now simply a drop in the bucket. By the time such a performance metric sees wide adoption, the unit of data it measures against will need to grow with it, keeping the metric as dynamic as the marketplace.

Whatever businesses do, they will need to know their data inside and out to continue to decode the intrinsic value located within their data lakes. Thorough data quality management, efficient data integration, and a clear strategy for aligning analysis with enterprise objectives will be crucial to ensuring the greatest earnings per byte. In most cases, that means collaborating with an information management partner to identify the most impactful action items necessary for demonstrable results.