9 Lessons from Stax's Experience in Accelerating Value for Private Equity Sponsors Through Data
For 28 years, Stax has answered valuable questions through digital transformation engagements and implementation projects. We combine commercial acumen, analytics, and technology expertise to deliver data-driven, action-oriented outcomes for private equity sponsors and their portfolio companies.
We have worked with private equity-backed companies across diverse industries to accelerate value creation during hold periods. Our clients engage Stax with a single transformative purpose: increasing organizational access to data and empowering leaders at all levels to make data-driven decisions.
Stax’s team of data scientists, data engineers, business analysts, and technology consultants not only designs a long-term roadmap for data and analytics transformation, but also implements its key recommendations: creating harmonized master datasets, developing tools and solutions for managing and enriching golden records, and establishing processes and client-side teams for systematic data governance.
During this journey, we uncovered nine key lessons for running large-scale digital transformation projects:
Lesson 1: Play Partner, Not Service Provider
From the outset of an engagement, Stax positions itself as a “partner in business,” available to solve the organization’s analytical challenges collectively and collaboratively. While we take pride in the consultant designation, we understand that “consultant” can sometimes be misinterpreted as a mere provider of outside opinions, expert advice, and recommendations.
Adjusting our narrative, tone, and language to come across as partners who are in it for the long haul and ready to walk our talk allows us to form better working relationships with the client’s key decision-makers.
These relationships have been instrumental in lifting people’s natural filters and restraints as we delve progressively deeper into the roots of systemic issues, building a finer understanding of the company’s data problems and their interdependencies.
Lesson 2: Instill a Strong Sense of Ownership
A good plan with organization-wide support will always outshine the perfect plan with no backers. As part of our “Analytics Roadmap” initiative, we conduct interviews with senior leaders across the organization as well as with the private equity sponsor’s deal team.
The analytical requirements and challenges we gather are synthesized into a set of individual projects and key recommendations, which are then validated and prioritized through a series of working sessions with senior cross-functional representatives. During these sessions, we encourage every representative to share their opinions freely and, more importantly, to push back against anything they believe to be of low priority or insignificant.
As a result, the Analytics Roadmap – essentially a transformation plan for the company’s use of, and access to, data and analytics – often garners a great deal of excitement and approval from the company’s executive leadership team.
Lesson 3: Transformation = Disruption, So Think Low Disruption, High Impact
Every transformational activity, by definition, disrupts the operations of an organization. However, no organization can afford total disruption, especially with the Covid-related challenges that now characterize our economic environment. What is then required is the equivalent of fixing an airplane as it flies.
During the discussion and formulation of the Analytics Roadmap, Stax develops a Business Impact & Required Effort framework to identify all analytics transformation projects with a high degree of business impact and low level of disruption.
While this can be challenging given each organization’s systemic and process-related complexities, we prioritize solving the needs of the commercial and strategic functions, where improvements generate the greatest returns most quickly.
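A framework like this can be sketched as a simple scoring model. The project names, scores, and thresholds below are hypothetical illustrations, not figures from a real engagement:

```python
# Sketch of an Impact vs. Effort prioritization. Each candidate project is
# assumed to be scored 1-5 on business impact and on required effort
# (disruption); all names and scores here are illustrative.

def prioritize(projects, impact_threshold=4, effort_threshold=2):
    """Return projects sorted so high-impact, low-effort work comes first."""
    quick_wins = [p for p in projects
                  if p["impact"] >= impact_threshold
                  and p["effort"] <= effort_threshold]
    rest = [p for p in projects if p not in quick_wins]
    # Within each group, rank by impact-to-effort ratio, highest first.
    key = lambda p: p["impact"] / p["effort"]
    return sorted(quick_wins, key=key, reverse=True) + sorted(rest, key=key, reverse=True)

projects = [
    {"name": "Customer master dataset", "impact": 5, "effort": 2},
    {"name": "ERP migration",           "impact": 4, "effort": 5},
    {"name": "Sales dashboard",         "impact": 4, "effort": 1},
]

ranked = prioritize(projects)
print([p["name"] for p in ranked])
```

The “quick wins” surface first; higher-effort projects remain on the roadmap but are sequenced later, which is the essence of thinking low disruption, high impact.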
When commercial and strategic functions better understand not only the market and their customers, but also their historical relationship with each customer, their share of the market, and the vitality of their product portfolio, they can better align their customer-facing and operational activities with the evolving (and highly complex) needs of their customers and the dynamics of the market.
Lesson 4: Constant Communication Unlocks Significant Time-Savings
Providing the foundational data to enable commercial and strategic insights requires a large-scale effort to clean, standardize, validate, harmonize, and complete disparate datasets drawn from an organization’s many systems.
Maintaining a regular line of communication with a team of cross-functional representatives during the entire process allows us to supercharge our efforts by learning from the operational and historical context they provide.
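One concrete harmonization step that benefits from this context is standardizing entity names captured differently across source systems. The alias map below is a hypothetical illustration; in practice it would be built with the client’s representatives, whose operational knowledge tells us which variants refer to the same entity:

```python
# Minimal sketch of one harmonization step: mapping free-text customer
# names from different systems to a single golden value. The alias map
# and sample records are illustrative assumptions.

ALIASES = {
    "acme corp.": "Acme Corporation",
    "acme corporation": "Acme Corporation",
    "acme, inc": "Acme Corporation",
}

def standardize(raw_name: str) -> str:
    """Trim and collapse whitespace, case-fold, then map known variants."""
    cleaned = " ".join(raw_name.strip().split()).lower()
    return ALIASES.get(cleaned, raw_name.strip())

records = ["  ACME Corp. ", "Acme Corporation", "Globex"]
print([standardize(r) for r in records])
```

Unmapped names pass through unchanged, so new variants can be surfaced to the cross-functional team for review rather than silently mangled.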
Lesson 5: Sustain Interest by Continuously Demonstrating Value
It is natural for fatigue to set in during any long-term project, and for stakeholders to lose track of its vision and objectives amid the minutiae of data issues. It is therefore important to keep relaying the project’s purpose and, more importantly, to keep demonstrating the creation of value as frequently as possible.
During our weekly catch-ups with the client’s core team, in addition to sharing progress updates, we make sure to share our discoveries, our evolving thoughts on potential technological solutions, and even work-in-progress datasets and analyses.
Adopting a “show your workings” approach enables us to win client confidence in our process and eventually ensures their expectations are met upon handover of our deliverables.
Lesson 6: “Data-Driven” Requires Robust Infrastructure
While every company aspires to create value through artificial intelligence and machine learning, in our experience only a few are positioned to do so. Advanced applications of data analytics require a robust data infrastructure that enables the governance and maintenance of cleaned, complete, comprehensive, harmonized, and accurate sets of data collectively representing a variety of domains (e.g., master customer data, master product data, etc.).
To sustain the master datasets that we manually assemble for a client’s commercial and strategic functions, we develop a Master Data Management (MDM) solution hosted on Azure, plugged into every internal system, and connected to a custom-built data quality management portal that allows data stewards to immediately address a variety of data quality issues as they arise.
Importantly, we ensure that the Master Data Management solution is developed with scalability in mind, enabling the client to easily extend its functionality to other functions (and thus data domains) and thereby progressively capture the analytics use cases of the entire organization.
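One class of issue such a data quality portal surfaces to stewards is near-duplicate master records. A minimal sketch using fuzzy string similarity follows; it uses Python’s standard-library difflib, and the similarity threshold is an assumption, not a value from the actual MDM solution:

```python
# Hypothetical sketch of one data quality check an MDM portal might
# surface to data stewards: flagging near-duplicate customer records
# for review and merging. The 0.85 threshold is an assumption.
from difflib import SequenceMatcher
from itertools import combinations

def near_duplicates(names, threshold=0.85):
    """Return pairs of names whose similarity ratio meets the threshold."""
    pairs = []
    for a, b in combinations(names, 2):
        if SequenceMatcher(None, a.lower(), b.lower()).ratio() >= threshold:
            pairs.append((a, b))
    return pairs

masters = ["Acme Corporation", "Acme Corporatian", "Globex Industries"]
print(near_duplicates(masters))
```

In a production MDM solution this kind of check would run as records arrive from source systems, queuing flagged pairs for a steward’s decision rather than merging automatically.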
Lesson 7: Not all Data Problems are Technology Problems
Even the most well-intentioned technology cannot magically resolve business problems that are defined by behavioral and process issues. Conflicting terminology, varying calculations of the same metric across functions, errors during data entry, unsynchronized data cleaning activities, and data duplication are all examples of issues that require the formulation and strict enforcement of standard operating procedures. Hence, in addition to providing a robust data infrastructure, we assist with creating a Data Governance Council and hiring a presiding Master Data Manager.
The Data Governance Council serves as the link between data consumers and data producers, aligning data capture and systems use with analytical needs. The Master Data Manager oversees the activities of the Data Stewards that comprise the Data Governance Council to ensure that the client’s master data is constantly harmonized and free of data quality issues.
Lesson 8: Specific Business Problems Require Specific Solutions
With market intelligence increasingly suggesting that a client should prepare for a ramp-up in demand, and a corresponding need to centralize sales, inventory, and operations planning, Stax worked to understand how the monthly demand forecasts provided in varying formats by each customer could be consolidated and combined with internal data to generate an organization-wide view of customer demand.
We realized that, given the nuanced parsing each forecast document requires and the complexities of merging internal data with the consolidated view of customer demand, a bespoke solution was required. Through a series of product design workshops, Stax developed a forecast document consolidation and enrichment tool using Python, Django, and React, cutting the time needed to consolidate monthly forecast documents from roughly two weeks to about half a day and providing the company the foundational data required for demand planning and sales outreach prioritization.
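The core consolidation idea can be sketched in a few lines: each customer submits forecasts under its own column names, and a per-customer mapping normalizes them into one schema. The customer names, column mappings, and sample rows below are hypothetical, and real forecast documents require far more nuanced parsing (spreadsheet layouts, merged cells, per-customer quirks):

```python
# Illustrative sketch of the consolidation step: normalize per-customer
# forecast rows, each with its own column names, into one canonical
# organization-wide demand view. All names and figures are hypothetical.

# Per-customer mapping from their column names to our canonical schema.
COLUMN_MAPS = {
    "customer_a": {"period": "month", "qty": "units"},
    "customer_b": {"forecast_month": "month", "demand_units": "units"},
}

def consolidate(submissions):
    """Merge per-customer forecast rows into {(customer, month): units}."""
    demand = {}
    for customer, rows in submissions.items():
        mapping = COLUMN_MAPS[customer]
        for row in rows:
            canonical = {mapping[col]: value for col, value in row.items()}
            key = (customer, canonical["month"])
            demand[key] = demand.get(key, 0) + canonical["units"]
    return demand

submissions = {
    "customer_a": [{"period": "2022-07", "qty": 120}],
    "customer_b": [{"forecast_month": "2022-07", "demand_units": 80},
                   {"forecast_month": "2022-08", "demand_units": 95}],
}
print(consolidate(submissions))
```

Once every submission lands in the same schema, joining it with internal sales and inventory data becomes a routine merge rather than a two-week manual exercise.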
Stax designs its in-house solutions with cross-system integrability in mind, enabling all data cleaning and consolidation activities to be administered from a single platform.
Lesson 9: You Can Do a Lot More Remotely Than You Think
Since the outbreak of the Covid-19 pandemic, Stax has strived to leverage its hybrid work culture to optimize program delivery and speed up solution development.
A few best practices have allowed us to make remote delivery work seamlessly.
Stax brings value to clients throughout the portfolio company lifecycle, solving many of their most pressing challenges. With over 28 years of experience, we differentiate ourselves through a unique combination of strategy, data, and technology expertise and a deep understanding of private equity and portfolio company outcomes. Our expertise across industries and their value chains has allowed Stax to unlock clients’ understanding of their markets and growth opportunities. And all of this was done remotely, saving a significant amount of time, effort, and money.