By Ian Cowley, Head of Data Engineering, Ensono

Smarter, faster, better decision making starts by recognizing and removing the hidden-in-plain-sight constraints on your data’s full potential.
The rate at which data is growing is staggering. 181 zettabytes of data are expected to be captured, created and consumed worldwide by 2025. To put that into a semi-relatable context, imagine a terabyte as a single grain of rice. A billion grains of rice, enough to fill a large industrial warehouse, is the equivalent of just one zettabyte; so is the storage capacity of approximately 15.6 billion smartphones. 181 of them? It’s an almost unfathomable number.
Organizations are collecting and storing more data than ever before, all of them wanting to claim the “data-driven” mantle. Unfortunately, few are successfully translating their vast data stores into the kind of timely insights that can drive the better, smarter decision making and real business growth that data-driven dreams are made of. No wonder—it’s a process burdened by legacy structures and capability debt, and prone to a common set of challenges and pitfalls.
Yet those companies that do crack the nut of efficiently and effectively collecting, organizing, analyzing and acting on their data’s full potential will possess the single most important and influential competitive advantage of both our time and the foreseeable future.
Speed is of the essence when it comes to accessing and analyzing data; it’s a factor that directly impacts business agility. It’s also a widespread challenge. In a recent survey of IT leaders and decision makers, 35 percent of respondents report that delayed time to insights poses a major obstacle to using data successfully.1 Insights delivered too late leave valuable opportunities in the dust of market progress. That delay typically comes down to a few key issues:
Data frequently resides across various disconnected enterprise systems such as CRM and ERP platforms, data warehouses and cloud storage. Linking these datasets and the insights they contain can be extremely difficult. With data fragmented across siloed platforms, there is no single, reliable source of truth. Organizations struggle to leverage information across departments to form comprehensive views of business operations, which prevents effective strategic planning.
Inefficient and often manual processes also hamstring many organizations’ “data-driven” ambitions. In that same survey, 71 percent of IT leaders report spending over 24 hours per week simply gathering data for reports.1 Data accessibility and quality issues due to poor pipelines and human error breed mistrust in insights over time, forcing a regression to reliance on intuition, rather than data, to drive decision making.
Data literacy—the ability of employees to read, analyze and communicate with data—is arguably the most valuable commodity a knowledge worker can possess in the digital age. But it’s in short supply for most organizations. Just 18 percent of IT leaders are fully confident in their staff’s existing skills and training to leverage analytics, a problem compounded by the fact that in many organizations, data analytics and engineering skills remain siloed in narrow IT and analytics teams. 40 percent of IT leaders report that their IT departments bear sole responsibility for data and analytics strategy, creating pressure on those teams and bottlenecks for the wider business, while failing to build broad data literacy.1
Aging, legacy IT systems make collecting, organizing and querying data extremely challenging compared to cloud-based alternatives. Merely exporting data from mainframes for analysis can take 24 hours—an unbearable lag in today’s world. Until legacy platforms are better connected to and integrated with cloud platforms, the incredibly rich business-critical data they contain will remain cut off from timely analysis and reaction.
1“Unlocking your data for better, faster decision making: Key findings,” Ensono.com, 2023.
A fundamental issue underpinning many of these challenges is the lack of a coherent data strategy tied to long-term business goals. 31 percent of IT leaders report updating their data and analytics strategy monthly, while 40 percent make revisions on a quarterly basis.1 This short-term orientation severely limits the ability to drive enterprise-wide change. Without a long-term roadmap for capabilities spanning people, processes and technologies across years, not months, data solutions end up siloed and disjointed, gaps and disconnected insights linger, and value remains buried.
The inability to take a long view also prevents organizations from capitalizing on the full range of opportunities data provides to virtually every business function. While pockets of innovation may occur, the sum remains less than the potential outcome of a comprehensive multi-year strategy, where new analytics use cases and business opportunities would compound year over year.
In today’s data-fueled landscape, a unified data strategy is essential to competitive survival and seizing market share. Without the vision for an insights-driven organization, or commitment to the investments that can get you there, genuine business impact will remain elusive and limitations will always eclipse possibilities.
Once an organization sees the barriers standing in the way of its data goals, executing a plan to address them is critical. The following steps can provide a template for action:
Assess current use cases and identify gaps
A current state assessment will provide a baseline understanding of how business units use data analytics across core processes. This audit determines specific use cases, tools, data sources, roles involved and pain points for each department. Gaps highlight priority areas for improvement. For example, Marketing may leverage customer data minimally for targeting, while Sales relies on intuition over data to forecast deals. Identifying disconnected insights will expose and quantify hidden innovation opportunities.
Build a single source of data truth
A converged environment—for example, a cloud data lake and warehouse—forms the next-generation foundation for analytics, breaking historical data out of silos while centralizing real-time streaming data onto a single platform that anyone—IT staff and non-technical business users alike—can access, analyze and visualize. With data unified and reachable instead of trapped in operational systems or legacy warehouses, time-to-insight shrinks from days to minutes and critical business decisions can be made quickly with confidence.
Get your mainframe off the sidelines
The data contained in legacy systems has an essential role to play in a reimagined data infrastructure, and cloud migration isn’t the only—and often isn’t the best—way to bring it into the game. Models that connect mainframe and cloud environments can enable the former to do what it does best—support high-performance, compliance-heavy systems of record—while enabling the latter to tap all that trapped data, integrate it into agile, highly responsive systems of engagement and unlock its full value. (See also: “Are you realizing the value of hybrid cloud?”)
Champion analytics for the people—no PhDs required
For data’s potential to be truly, fully unleashed, access and understanding need to be democratized, with fluency spread across departments, teams and positions. Targeted training programs, ranging from informal lunch and learns to formal multi-week data camps, can teach non-technical employees to shed their fears and become proficient in analyzing and applying insights from dashboards and reports tailored to their roles. Adoption can be accelerated even further with complementary self-service BI tools aligned to each team’s skill sets, making data interaction intuitive rather than reliant on intermediaries. By embedding and nurturing foundational data skills across the enterprise, silos will dissolve and a truly data-driven culture will take root.
Implement a governance model
A defined governance strategy organizes cross-functional usage of data assets. Especially as usage scales across the enterprise, rules for security protocols, regulatory compliance, metadata standards, business glossaries, data lifecycles and ethics provide rigor and prevent a “wild west” scenario. Governance also helps ensure higher-quality, consistent data that users can trust to fuel decisions and actions. Clean, complete, timely data leads to accurate models and metrics.
Develop a long-term roadmap anchored to business objectives
An effective data strategy roadmap directly aligns short- and long-term analytics adoption to corporate objectives around revenues, customer lifetime value, new product launches and more. The key is grounding data strategy in core KPIs—otherwise, data initiatives risk becoming “science projects” disconnected from real business impact. Anchored to growth priorities, however, the multi-year plan provides the direction necessary to accelerate data ROI amidst complex modernization while optimizing human capital, technologies and processes, and enables targeted scaling of analytical capabilities over the next 3–5 years. Data thus becomes a true enterprise asset.
As mind-blowing as that 181 zettabytes number may be, the volume of data created and collected is going to keep on growing—and along with it, the urgency for organizations to convert their data into powerful, reliable, easily accessible fuel that powers a firing-on-all-cylinders workforce, exceptional customer experiences and extraordinary business outcomes. Prioritize this reality now, execute against a long-term strategy, and your organization will be primed to achieve your grandest data-driven ambitions and the many tangible business rewards that come with them.