As business leaders make plans for the new year, how many will focus on improving enterprise data quality, reliability, and availability? In many cases, the business value of data is overshadowed by competing priorities. Since the value is not always obvious, some education and effort are required to understand both the positive and negative impact that data has on an organization.
The business case for improving data must highlight both the cost of unreliable data and the positive opportunities that reliable data creates, such as predictive analytics and data monetization.
My observation based on many years as a management consultant is that most business leaders and key stakeholders are starved for reliable data. I have seen a variety of responses to this, such as:
- Establishing an internal group of analysts whose primary function is to execute "fire drills" to research answers to executive questions. These analysts must work fast before the original questions become irrelevant. In this situation, a lot of interpolation and extrapolation of information is required – because the data just does not exist.
- Relying on personal intuition and experience instead of data. This can work – IF the executive is among the gifted few whose intuition, experience and vision lead them to make prescient decisions. Many times, however, market forces and rapid innovation are a "speeding freight train", while business leaders relying solely on experience are standing on the tracks insisting that "no train is coming."
- Hiring a very expensive consulting firm. This normally involves a brief, yet intense, period of interviews and extensive data analysis, and finally results in a well-documented recommendation. In many cases, the recommendation is quite brilliant, but the obstacles to implementation, the required investments and the risks involved in large-scale change conspire to block the realization of projected benefits. Company associates and stakeholders are not fully engaged in the process and are therefore not invested in achieving the results.
Most of the rest of us, who are not geniuses or prodigies, have achieved some degree of success through "sustained and disciplined effort over time." How, then, should we apply this in our quest for reliable and useful data?
I have heard more than one executive express the sentiment: "millions for systems development, not one cent for fixing bad data!" A general perception exists that a dollar spent remediating corrupt data is a dollar wasted. One wise and seasoned colleague developed an excellent business case for a data cleanup project – which was never approved due to "more pressing business needs". In fact, high-quality, reliable data is a key resource that enables effective and efficient business processes in an enterprise – and directly enables the value creation process. In truth, running a business on bad data is comparable to trying to swim laps with wrist weights, ankle weights and a weight belt – a huge amount of effort is required just to overcome the forces working against you.
Why does corrupt data not get the attention it deserves? Because its cost is often hidden in the extra effort it demands for an organization's normal processes, such as:
- Requirement to verify/remediate customer data before using it for marketing and sales
- Lack of access to accurate and consistent product data
- Manual processes to manipulate/massage/reformat data for financial reporting (which may involve making "estimates" where data is required but does not exist)
- Inability to make performance comparisons within and across business units
- Lack of timely and accurate sales and profitability data
- Simply not knowing whether a product or service is making or losing money
- Inability to migrate data from one system to another without expensive transformation processes and loss of original meaning
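Several of the hidden costs above stem from defects that are straightforward to detect programmatically. As a minimal sketch (the records and field names here are hypothetical, purely for illustration), a basic quality audit over customer data might flag duplicates, missing values and malformed entries before that data reaches marketing or financial reporting:

```python
import re

# Hypothetical customer records; field names and values are illustrative only.
customers = [
    {"id": 1, "email": "ann@example.com",     "revenue": 1200.0},
    {"id": 2, "email": "ann@example.com",     "revenue": 1200.0},  # duplicate
    {"id": 3, "email": "bob[at]example.com",  "revenue": 450.0},   # malformed email
    {"id": 4, "email": "carol@example.com",   "revenue": None},    # missing value
]

# Very loose email shape check: something@something.something
EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def audit(records):
    """Return a simple defect report listing record ids by defect type."""
    seen = set()
    report = {"duplicate": [], "missing": [], "malformed": []}
    for r in records:
        key = (r["email"], r["revenue"])
        if key in seen:
            report["duplicate"].append(r["id"])
        seen.add(key)
        if r["revenue"] is None:
            report["missing"].append(r["id"])
        if not EMAIL_RE.match(r["email"]):
            report["malformed"].append(r["id"])
    return report

print(audit(customers))
# {'duplicate': [2], 'missing': [4], 'malformed': [3]}
```

A real audit would, of course, run against production tables with domain-specific rules; the point is that the cost of these defects is measurable once they are made visible.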
Thomas Davenport has asserted that the hidden cost of poor data can be as much as 20% of revenue in an average company. Running a company on bad data drives cost, which shows up as a negative impact to the bottom line.
We have all heard stories of those who perished on the ocean for lack of water. Though there was nothing but water around them, it was not useful or fit for purpose. Likewise, we find ourselves drowning in oceans of data, yet unable to obtain the critical, useful data that we need. This is true not only of business leaders but also of valued associates on the front lines with customers and key stakeholders. All key associates of the enterprise need the right data at the right time for the right purpose. The bullet points listed above illustrate problems that occur when reliable data is not available – they apply equally when the right data is not available to the right person at the right time.
Both reliable and useful data are required to support the value creation processes of the company, such as sales, marketing, manufacturing, customer service and fulfillment. More importantly, reliable and useful data are a pre-requisite for future growth and innovation.
Useful and Reliable Data as Foundation for Innovation
Visionary strategist Gary Hamel is one of the most insightful leadership experts of our time. A major theme in his writing is a warning: what made your organization successful in the past will not guarantee future success. In fact, much of what makes a company successful in the present may actually be toxic to its future. Some representative quotes:
- "The single biggest reason companies fail is they overinvest in what is, as opposed to what might be."
- "We’ve reached the end of incrementalism. Only those companies that are capable of creating industry revolutions will prosper in the new economy."
- "Most of us understand that innovation is enormously important. It's the only insurance against irrelevance. It's the only guarantee of long-term customer loyalty. It's the only strategy for out-performing a dismal economy."
In order to create competitive advantage, many organizations have turned to new tools and technologies, such as artificial intelligence, predictive analytics and machine learning. These are not ends in themselves but provide a way to create innovative/disruptive data-based business models and new processes to drive growth.
An absolute pre-requisite for implementing new data-based tools and technologies is data that is reliable, useful and available. Poor-quality, unreliable data cannot serve as a basis for advanced analytics and will not support improved business results. Indeed, some organizations have tried to apply analytics without any effort to improve the quality of the underlying data. No analytical process can magically transform bad data into value-added business insight.
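The point can be illustrated with a toy example (all figures hypothetical): the same averaging "analytics" run over a series containing a duplicate and an obvious fat-finger entry yields a materially different answer than the cleaned version, and no model downstream can undo that distortion.

```python
# Toy illustration (all figures hypothetical): average monthly sales,
# before and after basic cleanup of a duplicate and a miskeyed entry.
raw_sales = [10_000, 10_000, 12_000, 9_500, 950_000]  # duplicate + fat-finger entry

def clean(values):
    """Drop exact duplicates and implausible outliers (> 10x the median)."""
    values = sorted(values)
    median = values[len(values) // 2]
    cleaned = []
    for v in values:
        if cleaned and v == cleaned[-1]:
            continue  # skip exact duplicate
        if v > 10 * median:
            continue  # skip implausible outlier
        cleaned.append(v)
    return cleaned

dirty_avg = sum(raw_sales) / len(raw_sales)
cleaned = clean(raw_sales)
clean_avg = sum(cleaned) / len(cleaned)
print(round(dirty_avg), round(clean_avg))
# 198300 10500
```

The "insight" from the dirty data overstates average sales by nearly 19x; cleanup must come before analytics, not after.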
The world is not standing still. If your enterprise is profitable, emergent competitors you have never heard of are already at work finding innovative ways to capture your customer base.
The business case for investing in high-quality, reliable data has never been stronger. Let 2022 be the year that your organization takes bold steps to invest in the future. To shorten the learning curve in unlocking the true value of your organization’s data, you need a partner with depth and breadth of experience in data governance, data quality, data management and data architecture.
Mastech InfoTrellis stands ready to partner with you to shorten the cycle time from project start to value delivery. If your company is in the learning phase and ready for agile experimentation, a Data Science Kiosk project can help you build a business case and deliver value at the same time.