
Want to Judge Enterprise Innovation? Measure Earnings per Byte First

Dec 20, 2017


If your organization plans to remain a viable, long-term part of the marketplace, it needs to use data in new and increasingly profitable ways. Businesses that derive ongoing improvements and insights from their data are better at spotting emerging opportunities and staying ahead of the curve. The logical question is: how does an organization measure its own ability to use predictive insights in effective, valuable, and creative ways?

Just because your data runs fathoms deep does not mean it is being used to its full potential. Unless organizations measure the value their data actually generates, opportunities to improve revenue streams, increase operational efficiency, and move into new and profitable products and markets will continue to go unnoticed.

What Needs to Be Measured for Your Full ROI

As with any data strategy, organizations need to begin with a collective, enterprise-wide view. Data governance has to precede any attempt to gather complete and accurate data quality metrics. Disjointed analysis only produces localized insight, perhaps improving one department's data quality management but not the enterprise as a whole. Once that data vision expands, organizations will be far better positioned to analyze how their data-driven innovation relates to their financial growth.

Certain performance measures come close to assessing data impact but miss the objective by focusing on the end result without investigating the value of the source material. Return on Innovation Investment (ROII), which measures the effectiveness of enterprise innovation, is one such metric. An approach that might push this metric further toward our goal is to take Harvard Business Review's recommendation and combine it with DuPont's return-on-equity breakdown (stay with me) to subdivide the measurement into ratios that better reflect the quality of data assets (a rough sketch of how they multiply out follows the list):

  • Innovation Magnitude (financial contribution divided by successful ideas)
  • Innovation Success Rate (successful ideas divided by total ideas explored)
  • Investment Efficiency (ideas explored divided by total capital and operational investment)
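
As a quick illustration, here is a minimal sketch of how the three ratios chain together, DuPont-style, so the intermediate terms cancel and the product reduces to financial contribution divided by total investment. The function name and the figures in the example are hypothetical.

```python
def roii_breakdown(financial_contribution, successful_ideas, ideas_explored, total_investment):
    """Decompose Return on Innovation Investment (ROII) into the three ratios above."""
    innovation_magnitude = financial_contribution / successful_ideas
    innovation_success_rate = successful_ideas / ideas_explored
    investment_efficiency = ideas_explored / total_investment

    # The intermediate terms cancel, so the product is simply contribution / investment.
    roii = innovation_magnitude * innovation_success_rate * investment_efficiency
    return innovation_magnitude, innovation_success_rate, investment_efficiency, roii

# Hypothetical example: $2.4M of contribution from 6 successful ideas
# out of 30 explored, on a $1.5M innovation investment.
magnitude, success_rate, efficiency, roii = roii_breakdown(2_400_000, 6, 30, 1_500_000)
print(magnitude, success_rate, efficiency, roii)  # 400000.0 0.2 2e-05 1.6
```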

Though this looks closer to the type of performance metric needed, the measure still falls short. It is like recognizing the value of steel as a building material without measuring the strength and quality of the metals going into the alloy.

Digging deeper to evaluate both enterprise and analytics-department efficiency would require a measurement that compares how levels of data-driven innovation correlate with financial returns. Earnings per Byte (tera-, peta-, and so on), a measurement of enterprise earnings against data assets, would give both senior management and investors insight into how data translates into meaningful value. If the industry adopted such a metric on a more comprehensive scale, it would even prove valuable as organizations work to attract new investors. Not to mention, senior executives could take comfort that their IT or data departments' investments and analytic innovations are affecting the organization in a measurable, irrefutable way. But before that metric can be created and leveraged, there needs to be a common definition of data quality metrics.
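
In its simplest form, the metric is just enterprise earnings divided by the volume of data assets, scaled to whatever unit (terabytes today, perhaps petabytes tomorrow) keeps the number readable. A minimal sketch, with hypothetical figures:

```python
UNIT_BYTES = {"byte": 1, "terabyte": 10**12, "petabyte": 10**15}

def earnings_per_byte(earnings, data_volume, unit="terabyte"):
    """Enterprise earnings divided by the data assets under management, in the given unit."""
    return earnings / (data_volume * UNIT_BYTES[unit])

# Hypothetical example: $180M in earnings against 650 TB of managed data assets.
print(f"EPB: ${earnings_per_byte(180_000_000, 650):.2e} per byte")
# Or expressed per terabyte instead of per byte:
print(f"Earnings per terabyte: ${180_000_000 / 650:,.0f}")
```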

Quantifying the Value of Data Assets

Given the amassing of data assets – internally gathered or externally purchased, structured and unstructured, thoroughly analyzed or merely potential new data streams – a fundamental question arises: is the financial and operational investment required to support these assets worth it? To assess this properly in combination with an Earnings per Byte measurement, categorizing data sets would prove valuable.

  • Traditional Operational data – This is the data that organizations already understand how to draw insight from, report on, and use operationally. Examples include customer demographics, production costs and profit margins, product inventory, and supply chain KPIs.
  • Exploratory data – These data sets are not used on a regular basis and quite often remain latent and untapped, despite their potential for generating new revenue and reducing costs. Examples may include purchased data sets, connected home/IoT data, member biometrics, drug efficacy sensor data, etc.

Once categorized, operational and exploratory data would need to be evaluated differently (a sketch of the split follows the list). Consider:

  • Operational EPB – Calculations for traditional earnings per byte would signify the organization’s ability to utilize traditional data elements or features, further capitalizing on or maintaining steady state financial gains.
  • Exploratory EPB – Calculations for exploratory earnings per byte would act as a barometer of the organization’s forward progress in usage, untapped opportunity, and creative market leadership. They could surface new investment opportunities for deeper insight, or signal to decision makers when it’s time to update archival policies or reassess data management overall.
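
To show how the split might look in practice, here is a minimal sketch that tags each asset in a hypothetical data inventory as operational or exploratory and computes earnings per terabyte for each category; the class, field names, and figures are all assumptions, not an established method.

```python
from dataclasses import dataclass

@dataclass
class DataAsset:
    name: str
    size_tb: float              # volume in terabytes
    attributed_earnings: float  # earnings attributed to insights from this asset
    category: str               # "operational" or "exploratory"

def epb_for_category(assets, category):
    """Earnings per terabyte for one category of data assets."""
    subset = [a for a in assets if a.category == category]
    total_tb = sum(a.size_tb for a in subset)
    total_earnings = sum(a.attributed_earnings for a in subset)
    return total_earnings / total_tb if total_tb else 0.0

# Hypothetical inventory mixing steady-state operational data with latent exploratory streams.
inventory = [
    DataAsset("customer_demographics", 12.0, 4_500_000, "operational"),
    DataAsset("supply_chain_kpis", 8.0, 2_100_000, "operational"),
    DataAsset("iot_sensor_streams", 140.0, 900_000, "exploratory"),
]

print("Operational EPB ($/TB):", epb_for_category(inventory, "operational"))
print("Exploratory EPB ($/TB):", epb_for_category(inventory, "exploratory"))
```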

The Common Thread for Effectively Measuring Data Value

The complexities involved in defining and monitoring data to evaluate its monetary value are considerable, and one obstacle stands out: the viral pace at which the size and scope of data grows. A terabyte of data used to be significant; now it is a drop in the bucket. By the time a metric like this gains wide adoption, the unit of data it measures against will need to grow with it, keeping the metric itself as dynamic as the marketplace.

Whatever businesses do, they will need to know their data inside and out to keep decoding the intrinsic value within their data lakes. Thorough data quality management, efficient data integration, and a clear strategy for aligning analysis with enterprise objectives will be crucial to maximizing earnings per byte. In most cases, that means collaborating with an information management partner to identify the actions most likely to produce demonstrable results.
 
