Raise your hand if your company is making more than 15 of these mistakes!
Strategy
- Day-dreaming that analytics is a plug & play magic wand that will bring very short-term ROI. Well-executed basic Excel models might have brought quick wins in the 2000s, but advanced analytics takes time. Analytics is never plug & play: plugging data into models is extremely lengthy, learnings are not transferable across companies or markets, and it requires high OPEX in people and high CAPEX in systems.
- Solving problems that are not really worth solving, which results in a waste of time and resources. Analytics is not about solutions looking for problems but problems looking for solutions. Questions such as “What can we do with blockchain?” do not make sense. “How can I solve my marketing problem?” is a question that makes sense. The worst mistake a Chief Data Analytics Officer can make is not having an extremely clear view of the key challenges and opportunities each functional area is confronted with.
- Relying solely on vendors or consultants for analytics, especially for model creation. The post-mortem of how corporates fail to develop capabilities with consultants reads as follows: the client hires a consultant to deliver a project and, at the same time, develop internal capabilities. The client has far too unrealistic expectations about the impact of the project, and consultants never say “No” and oversell it. The impact does not materialize, and one day the client tells the consultant: “If you do not get some impact in the next month, I will stop your contract.” That day, capability development officially dies, if it had ever existed. RIP. A few million dollars in the trash bin. In any case, analytics is the brain of the company; how could corporates even think they could outsource it? Working with vendors and consultants can work, but the governance needs to be thought through.
- Not developing a focused list of priorities. Since you can only count five fingers on one hand, management should pick at most five metrics rather than making everything seem important.
- Saying yes to random requests, like pet projects or glamorous visualizations and reporting, which often result in the analysis-paralysis syndrome.
- Assuming that abstinence from external data monetization or from the cloud is the solution to data privacy and security. While there are regulatory restrictions in some industries and countries, and sometimes even ethical limits, external monetization and the cloud, done properly, do not necessarily involve security risks.
People
- Organizing analytics under functions which do not drive the business on a daily basis such as IT or strategy. Analytics is only powerful if it is coupled organizationally with daily operations.
- Letting multiple analytics teams flourish with organizational silos among them. Analytics needs to keep an integrated view of the business.
- Attracting talent only through base compensation. Instead, it is necessary to build a sense of purpose, to create a powerful employer brand and to develop internal talent.
- Hiring a bunch of PhDs who strive to develop highly nuanced models instead of directionally correct, rough-and-ready solutions, and hence fail to provide actionable insights. So, hire highly coachable fast learners (even if they hold a PhD).
- Hiring a technical Chief Data Analytics Officer or hiring a non-technical Chief Data Analytics Officer. The CDAO needs to be both: technical enough to coach the team and business-driven enough to understand business problems.
- Not bringing domain experts and internal business consultants into the analytics teams to bridge the gap between business leaders and analysts and to ensure an end-to-end journey from idea to impact.
- Neglecting the creation of a data-driven culture through active coaching across the whole organization from sales agents to the CEO, especially sales agents and the CEO.
- Not being objective enough and remaining biased toward the status quo or leadership thinking. Analytics teams deeply embedded in business functions or BUs are more likely to have these problems than centralized ones. This is why some organizations create quality-control teams.
Execution
- Not embedding analytics in the operating model and day-to-day workflows, which results in a failure to integrate technology with people. Using analytics as part of their daily activities helps users make data-driven judgements, make better-informed decisions, build consumer feedback into solutions and rapidly iterate on new products. Instead, many still rely on gut feeling and HiPPOs (Highest Paid Person's Opinions) to make decisions.
- Not co-locating data scientists with the business teams they support. Otherwise, they will not talk to each other.
- Managing analytics projects in waterfall. The parameters of a model cannot be known upfront; they are determined through an iterative process which looks more like an art than a science. Analytics projects therefore need to be iterative, following, for example, an agile framework (see the sketch after this list).
- Not being able to scale analytics pilots up. Analytics often starts with pilot use cases, and companies often end up killing those pilots as soon as they need to reallocate funding to other, shorter-term initiatives.
- Neglecting data governance as a fundamental enabler. Data governance refers to the organization, processes, and systems that a company needs to manage its data properly and consistently as an asset, ranging from managing data quality to handling access control or defining the architecture of the data in a standardized way.
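To make the iteration point concrete, here is a minimal sketch, assuming scikit-learn and a synthetic dataset, of how a model parameter emerges from repeated evaluation rather than from upfront planning:

```python
# Minimal sketch (scikit-learn, synthetic data) of why model parameters
# cannot be fixed upfront: each candidate setting is evaluated empirically,
# and the "right" one only emerges after several iterations.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=1_000, n_features=20, random_state=0)

best_score, best_depth = 0.0, None
for depth in (2, 4, 8, 16):  # one "sprint" per candidate setting
    model = RandomForestClassifier(max_depth=depth, random_state=0)
    score = cross_val_score(model, X, y, cv=5).mean()
    print(f"max_depth={depth}: cv accuracy={score:.3f}")
    if score > best_score:
        best_score, best_depth = score, depth

print(f"Chosen through iteration, not upfront: max_depth={best_depth}")
```

Each pass of the loop is the coding equivalent of a sprint: build, measure, learn, adjust.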
Technology
- Trying to create data science models without refining your data engineering infrastructure: cleaned repositories, efficient engines and streamlined extract-load-transform (ELT) processes (see the PySpark sketch after this list). Data engineering without real use cases to model is also wrong. Modelling and engineering must proceed in parallel, or at least iteratively.
- Not using any of the following basic technologies: Hadoop, Spark, R, Python, an advanced visualization tool of your choice, and a granular self-service reporting system open to the whole organization.
- Having technological silos among data repositories, which makes it difficult to integrate different kinds of data into a model. The power of analytics increases exponentially with the diversity of data.
- Not automating analytics through A.I., which can be an extremely smart assistant to data scientists. A.I. automations help data scientists cleanse data, check for correctness, deploy models, detect relevant prediction features and the obsolescence of models, or even generate hundreds or thousands of variations of models (a drift-detection sketch follows this list). All in all, the analytics strategy of the business has to be a subset of the whole A.I. strategy, since the datasets need to feed the A.I. systems.
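As a flavor of the engineering side mentioned above, here is a minimal PySpark sketch of an extract-load-transform flow; the file paths and column names are hypothetical:

```python
# Minimal ELT sketch in PySpark (paths and columns are hypothetical):
# raw data lands in the repository first, cleaning happens inside the engine.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("elt_sketch").getOrCreate()

# Extract & load: pull the raw CSV into the repository as-is.
raw = spark.read.csv("raw/sales.csv", header=True, inferSchema=True)

# Transform: standardize and clean inside Spark, not in ad-hoc scripts.
clean = (
    raw.dropDuplicates(["order_id"])
       .withColumn("order_date", F.to_date("order_date"))
       .filter(F.col("amount") > 0)
)

clean.write.mode("overwrite").parquet("curated/sales")
```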
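And as one concrete example of the A.I.-assisted automation in the last bullet, here is a minimal sketch, using SciPy's two-sample Kolmogorov-Smirnov test on simulated data, of flagging model obsolescence when a feature's live distribution drifts away from its training distribution:

```python
# Minimal drift-detection sketch (SciPy, simulated data): flag model
# obsolescence when a feature's live distribution no longer matches training.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(0)
training_feature = rng.normal(loc=0.0, scale=1.0, size=5_000)
live_feature = rng.normal(loc=0.4, scale=1.0, size=5_000)  # drifted stand-in

statistic, p_value = ks_2samp(training_feature, live_feature)
if p_value < 0.01:
    print(f"Drift detected (KS={statistic:.3f}): schedule retraining.")
else:
    print("No significant drift: the model can stay in production.")
```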
Finance
- Not allocating enough budget for analytics platforms while still keeping Shangri-La expectations. The opposite is also an error: allocating more money than needed, with no direct correlation to business outcomes.
- Not measuring the ROI of analytics initiatives. We know the ROI of analytics materializes in the mid term, but that does not mean you should not measure it (a toy calculation follows this list).
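For illustration only, a toy sketch of that calculation, with all figures hypothetical: the return of an initiative is its incremental benefit net of the people (OPEX) and systems (CAPEX) it consumed.

```python
# Toy ROI sketch for an analytics initiative (all figures hypothetical).
def analytics_roi(incremental_benefit: float, opex: float, capex: float) -> float:
    """Return ROI as a fraction of the total investment."""
    investment = opex + capex
    return (incremental_benefit - investment) / investment

# A use case yielding $1.2M in incremental margin against $400k in people
# and $300k in systems gives roughly 71% ROI over the period measured.
print(f"{analytics_roi(1_200_000, 400_000, 300_000):.0%}")
```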
Disclaimer: Opinions in the article do not represent the ones endorsed by the author’s employer.
This article was taken from LinkedIn.
Photo by Sarah Kilian on Unsplash.