Analytics is a wave that organizations cannot ignore. They can either embrace it early and capture a first-mover advantage, or follow later, learning the game from others and treading cautiously. There is no third option. But while companies experiment with the right approach to analytics for solving real business issues, our experience shows that the majority of the challenges they face fall into 10 areas. The most interesting part is that none of these challenges relates to the analytics itself, but rather to the whole paraphernalia around it. So what are these?
1. Insufficient data. Is that a reason not to proceed? Definitely not. Let us face it: analytics will be necessary to stay competitive. Avoiding the transformation is not the solution, because without it, it will be impossible to expand data sources in depth and breadth. Putting together a phased approach to analytics adoption gives the data journey the direction it needs. In fact, companies that go analytics first and data second tend to use resources better than those that do the reverse, because the exact nature and requirements of the data are understood well before solutions are implemented. In one of our engagements, a client had a CRM but was not capturing any data from the platform. Putting the analytics tool in place created a need within the organization to collect the necessary customer data, which in turn powered the analytics solution to suggest customer-centric services.
2. Data exists, but it is largely disintegrated. This is the lesser of the evils. Data extraction and integration solutions have become highly advanced: using different tools, companies can extract and integrate data from a diverse set of platforms and file types. The important bit is to understand what data is required and to put together a scalable plan for integration. At Praxis, we have partners that provide integration services across a wide range of enterprise software products, so you are in safe hands when working with us.
3. Data inaccuracy. Let us face it: not all data will be correct. Across the organizations we have worked with, 20-30% of the data has had questionable accuracy. The good part is that all organizations face this issue. The solution is to a) clean the existing data once, using an 80:20 approach, and b) understand the ground-level issues that are creating the noise, and put processes in place to rectify them.
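As an illustration of the one-time audit step (a minimal sketch, not from any specific client engagement; the field names and plausibility rules below are hypothetical assumptions):

```python
# Hypothetical sketch: partition records into "clean" and "questionable"
# before a one-time 80:20 cleanup. Field names and rules are illustrative
# assumptions, not a prescription for any particular dataset.

def audit_records(records):
    """Return (clean, questionable) partitions of the input records."""
    clean, questionable = [], []
    for rec in records:
        problems = []
        if not rec.get("customer_id"):
            problems.append("missing customer_id")
        age = rec.get("age")
        if age is None or not (0 < age < 120):
            problems.append("implausible age")
        if problems:
            questionable.append((rec, problems))
        else:
            clean.append(rec)
    return clean, questionable

records = [
    {"customer_id": "C001", "age": 34},
    {"customer_id": "", "age": 34},      # missing ID -> questionable
    {"customer_id": "C003", "age": 250}, # implausible age -> questionable
]
clean, questionable = audit_records(records)
print(f"{len(questionable)}/{len(records)} records need review")
```

The 80:20 idea is to focus manual effort only on the flagged minority rather than re-verifying every record.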
4. Cross-functional silos. In large organizations, data sits in different departments, and integrating the various functions is fairly critical to the success of analytics use cases. Such situations need time. When we work with large organizations, we spend significant time pre-wiring and on-boarding critical stakeholders on the data and its sanctity. Functional SPoCs are very important for defining ownership of different datasets. Praxis’ programmatic approach to analytics is tailored to large organizations that need a longer-term, deeper partner.
5. Process to run analytics: Creating a tool powered by AI and machine learning is one thing; putting in place a process to use that tool is another. Given the number of stakeholders, decisions, and considerations present in a complex environment, the analytics engine's design, usability, and functionality need to be baked into the process that will govern the tool's use. Permissions, user access, sequencing of tasks, refresh intervals, data usage, etc. all need to be tuned to stay in sync with that process. At Praxis, this is one of our biggest differentiators: we strive to create analytics as a capability within an organization, not merely a tool or solution.
6. Lack of skilled resources: Technologies used in analytics can be highly diverse, from data cleaning and base-lining to algorithm development, visualization, and system integration. The stakeholders who will run the analytics capability need to be well versed in the kind of technology the solution uses. An organization must hence start from the top to set up an analytics capability, and grow the team as different use cases are implemented. It is therefore also critical to define the technology journey with respect to analytics early on.
7. Outcome orientation: This is largely a self-created problem among most analytics companies in the industry today. The typical approach to any engagement is to look at the data, do some exploratory analytics, and then design a solution. At Praxis, we flip this completely: we first diagnose the issue, create a hypothesis, and immediately mock up the full solution we plan to build (defining the dashboards, features, functionalities, etc.). This brings our clients, key stakeholders, and any other parties onto the same page, aligned with the intended outcomes very early in the project. Only after full alignment do we initiate the bulk of the analytics work.
8. Lack of patience: Many analytics tools use machine learning models that improve over time. To recap, the early-mover advantage in analytics is strong: companies that adopt early extract disproportionate value. It is therefore important to give the engine, the data baseline, and the internal capability time to mature in order to fully realize the value of a solution. Typically, we see an analytics solution deliver its full value within 1-2 years of implementation.
9. Preventive processes for data collection: After a data cleaning and base-lining exercise, it is of utmost priority to put processes and checks in place so that future data collection meets the required accuracy. These can range from simple sensitization trainings to complex technological upgrades. Without them, analytics solutions will crumble: feeding in clean data becomes a mammoth exercise for employees, and ultimately it slips through the cracks.
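On the technological end of that spectrum, a preventive check can be as simple as validating records at entry time so bad data is rejected at the source rather than cleaned later. A minimal sketch, assuming a hypothetical record schema (the required fields and rules below are illustrative, not from the original text):

```python
# Hypothetical sketch: validate records at data-entry time so inaccurate
# data never reaches the analytics pipeline. Schema and rules are
# illustrative assumptions.

REQUIRED_FIELDS = {"customer_id", "order_date", "amount"}

def validate_on_entry(record):
    """Return a list of violations; an empty list means the record passes."""
    errors = [f"missing field: {f}" for f in REQUIRED_FIELDS - record.keys()]
    amount = record.get("amount")
    if isinstance(amount, (int, float)) and amount < 0:
        errors.append("negative amount")
    return errors

def ingest(record, store):
    """Accept the record only if it passes validation."""
    errors = validate_on_entry(record)
    if errors:
        return errors  # surface to the user for correction at the source
    store.append(record)
    return []

store = []
ok = ingest({"customer_id": "C1", "order_date": "2024-01-05", "amount": 99.0}, store)
bad = ingest({"customer_id": "C2", "amount": -5}, store)
print(f"{len(store)} record(s) accepted; rejected record had {len(bad)} issue(s)")
```

The design choice here is to push the cost of accuracy onto the moment of entry, where the person with context can fix it, instead of onto a later cleanup exercise.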
10. Overloading: This is the final but most common cause of limited success in analytics projects, and it is most evident with first-time users. Clients are typically hungry for use cases and impact from analytics, rightly so, but a “do it all at once” approach sometimes backfires and the primary intent gets lost. Overloading an analytics tool with multiple features and functionalities is generally not a good idea. The best way forward is the MVP approach: execute a core use case first, then continuously add and implement features and functionalities around it.