Open a newspaper, watch a talk show or listen in on any conversation: the term 'the new normal' can't have escaped you. We can cautiously state that it has already become part of our collective memory. It is a wake-up call that reminds us that the world we live in is constantly changing, and that today's 'normal' need not be tomorrow's. Consumer behavior and needs have shifted significantly in recent weeks. We see this in purchasing behavior, for example: demand for certain products differs sharply from what we consider normal. While demand for swimwear declines now that the May holiday in the sun is cancelled, supermarkets see sales of luxury products rise as people eat out 'at home' more often. How do you deal with this, and what effect do these changes have on the effectiveness of the data-driven solutions you have in production?
Data-driven software solutions are often still seen as static, as a one-time implementation. In today's rapidly changing context, that view is no longer tenable. An algorithm is good at analyzing large quantities of data, recognizing patterns and making predictions about the future based on information from the past. However, the algorithm is only effective if the data it uses is relevant. It lacks emotional intelligence and therefore has limited or no understanding of the environment in which it operates; it does not watch the news or read the papers. It is therefore unable to judge, on its own, whether data from the past is still relevant after a drastic change. For maximum performance, this input must be given to the algorithm so that it remains relevant and up to date. If algorithms do not move along with the new context, their value and relevance will keep decreasing.
In extremely fast-changing circumstances, you want control over the algorithms that are in production, and you want to be able to make timely adjustments that correct the model for changing environmental factors. With a standardized software solution with a static structure, this is often not possible. And even when the possibility exists, you remain dependent on the software provider, and there is a good chance that the turnaround time will not be fast enough.
The solution can be found in a dynamic data science structure, where the best of consultancy and software come together. Within this unambiguous and manageable structure, algorithms can be adapted quickly and easily to the new reality. With pure consultancy, the lack of such a manageable structure makes this a time-consuming and complex task. Data science and personalization work optimally when both humans and algorithms are enabled to do what they do best.
Human monitoring is decisive for timely intervention. The algorithm is self-learning and automatically adapts to gradual changes, but in extremely fast-changing times the combination of human creativity, business knowledge and algorithms is unbeatable. 'Alerts' can help in recognizing abnormal behavior. If, for example, you see that your personalized recommendations no longer yield the expected conversions, that can be a sign to intervene. By reasoning from the business context and validating against the data, these deviations can be explained. The resulting insights can then be fed back into the algorithm, for example by drawing up new business rules and/or selecting more relevant data. In this way the algorithm learns to understand the new context, you stay relevant for your consumer and you avoid misplaced recommendations.
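To make the idea of an alert concrete, the check described above can be sketched in a few lines of code. This is a minimal illustration, not a production monitoring system: the function name, the daily-conversion-rate inputs and the three-standard-deviations threshold are all assumptions chosen for the example.

```python
def conversion_alert(history, recent, threshold=3.0):
    """Flag a recent conversion rate that deviates strongly from history.

    history:   past daily conversion rates (e.g. conversions / visits)
    recent:    the most recent daily conversion rate
    threshold: how many standard deviations count as abnormal
    """
    n = len(history)
    mean = sum(history) / n
    variance = sum((x - mean) ** 2 for x in history) / n
    std = variance ** 0.5
    if std == 0:
        # No historical variation at all: any change is abnormal.
        return recent != mean
    # Distance of the recent rate from the historical mean, in std units.
    z = abs(recent - mean) / std
    return z > threshold

# A stable conversion rate of around 4%, then a sudden drop to 1%:
past = [0.041, 0.039, 0.040, 0.042, 0.038, 0.040, 0.041]
print(conversion_alert(past, 0.010))  # True: time to intervene
print(conversion_alert(past, 0.040))  # False: within normal variation
```

When an alert like this fires, it does not say *why* conversions dropped; that is exactly where the human steps in, explains the deviation from the business context, and feeds new business rules or data selections back into the algorithm.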
Is this corona crisis an exceptional situation? Absolutely: the change takes place in an extremely short time frame and is quite drastic. Consumer preferences and expectations, however, are constantly changing. Whereas a few years ago consumers shopped online only occasionally and purely for convenience, they now expect a total experience with personalized service. Whereas consumers used to contact the customer service center only by phone, between 9 and 5, modern consumers expect organizations to be reachable 24/7 through multiple channels. Nowadays this is called on-demand and omnichannel.
To remain distinctive as an organization in the long term and not lose the connection with the consumer, you are forced to continuously adapt your business strategies to these changes, large or small. This is no different for your data science strategy. The solutions that are in production will have to move along with the changing context. With a standardized solution, you lose relevance to the consumer over time and the fit with your strategy dilutes. With pure consultancy, the lack of a data science structure leads to long lead times and high costs.
The use of a dynamic data science structure is therefore important not only in the short term. You want a solution that grows along with the developments within your strategy, consumer expectations and the technological possibilities. That way you can continue to innovate and develop your technology stack. Because of the unambiguous and manageable structure, it is relatively easy to add new elements, to personalize service on other channels based on the same insights, and to integrate new knowledge. In this way you improve your service, remain relevant to the consumer and build up your distinctive character.
Seeing data science as a one-off implementation, as software that keeps itself up to date, is an important pitfall that hinders you in setting up a successful data science strategy. Curious which other pitfalls are lurking? Then see the white paper 'Win de race om de consument met data science' ('Win the race for the consumer with data science', in Dutch).