Using MLOps for fully automated and reliable sales forecasting
Global asset manager
- Customer case
- Data Engineering
- Data consultancy
A global asset manager, specialising in Quant and Sustainable Investing, offers a range of investment strategies, including equities and bonds. To strengthen their competitive position and proactively respond to changing client needs and market developments, the sales and marketing department aimed to adopt a more data-driven approach.
The team was already running data analyses to answer ad-hoc queries, and a data scientist had developed a machine learning model to predict sales opportunities. However, the model only ran on the data scientist's laptop, so the forecasts on the dashboard quickly became outdated and keeping them current required a lot of manual work. While this was a solid first step, it was not a sustainable, future-proof solution.
To create an automated system that could generate periodic forecasts and send them directly to the dashboard, they enlisted the help of one of our Data Engineers.
Approach
We worked alongside the asset manager’s data scientist through coaching sessions. Our goal was to automate and future-proof the existing sales forecasting model using MLOps best practices. This ensured that the model would continue to operate in the future and allowed for easier integration of new models. Since the data scientist was closely involved throughout the process and implemented many aspects with coaching, the knowledge and MLOps methodology would remain embedded within the company. The process involved the following steps:
1. Automating source data: The first task was to automate retrieval of the model's training data from the source. A challenge here was that the asset manager was restructuring their data warehouse at the same time, so we couldn't connect to it directly. Instead, we temporarily automated the upload of data from the analytics system to Azure (a minimal sketch of such an upload follows after this list). Once the new data warehouse is ready, the model will be able to connect to it directly.
2. Setting up pipelines: Next, we created two pipelines in Azure ML and Azure DevOps: one for training the model and one for generating predictions. The prediction pipeline also ensures that the output automatically reaches the right location for the dashboard. A sketch of how such a pipeline can be defined also follows after this list.
3. Rewriting code: We rewrote the code to make it suitable for automation and for use in the two pipelines. We also placed the code in a Git repository to enable version control and CI/CD.
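As an illustration of the interim upload described in step 1, the sketch below shows how a data export could be pushed to Azure Blob Storage with the azure-storage-blob Python SDK. The connection string, container name and file name are hypothetical placeholders; the actual upload mechanism and storage target used in this project may differ.

```python
import os
from azure.storage.blob import BlobServiceClient

# Hypothetical placeholders: connection string, container and file names
# are illustrative, not the names used in this project.
CONNECTION_STRING = os.environ["AZURE_STORAGE_CONNECTION_STRING"]
CONTAINER_NAME = "sales-forecast-source-data"
LOCAL_EXPORT = "analytics_export.csv"


def upload_export() -> None:
    """Upload the latest analytics export so the training pipeline can pick it up."""
    service = BlobServiceClient.from_connection_string(CONNECTION_STRING)
    container = service.get_container_client(CONTAINER_NAME)
    with open(LOCAL_EXPORT, "rb") as data:
        # Overwrite the previous export so the pipeline always reads the newest data.
        container.upload_blob(name=LOCAL_EXPORT, data=data, overwrite=True)


if __name__ == "__main__":
    upload_export()
```

A scheduled job (for example a nightly task) can run this script until the new data warehouse is ready and the pipelines can read from it directly.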
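The sketch below gives an impression of how a training pipeline like the one in step 2 can be defined with the Azure ML Python SDK (v2). The workspace details, script names, environment, compute target and data path are assumptions for illustration only, not the project's actual configuration; the prediction pipeline follows the same pattern, with a scoring step that writes its output to the dashboard location.

```python
from azure.ai.ml import MLClient, Input, Output, command, dsl
from azure.identity import DefaultAzureCredential

# Hypothetical workspace details and asset names, for illustration only.
ml_client = MLClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
    resource_group_name="<resource-group>",
    workspace_name="<workspace>",
)

# A single training step; the real pipelines may contain more steps
# (data preparation, evaluation, model registration, ...).
train_step = command(
    name="train_sales_forecast",
    code="./src",  # folder containing train.py
    command="python train.py --data ${{inputs.training_data}} "
            "--model-dir ${{outputs.model_dir}}",
    inputs={"training_data": Input(type="uri_folder")},
    outputs={"model_dir": Output(type="uri_folder")},
    environment="azureml:sklearn-env@latest",  # hypothetical environment
    compute="cpu-cluster",                     # hypothetical compute target
)


@dsl.pipeline(description="Retrain the sales forecasting model")
def training_pipeline(training_data):
    step = train_step(training_data=training_data)
    return {"model_dir": step.outputs.model_dir}


# Submit the pipeline; scheduling it (for example monthly) makes retraining periodic.
job = training_pipeline(
    training_data=Input(
        type="uri_folder",
        path="azureml://datastores/sourcedata/paths/sales/",
    )
)
ml_client.jobs.create_or_update(job, experiment_name="sales-forecast-training")
```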
We set up the automation according to the MLOps method, making it suitable for multiple models and future-proof. To do this, we made the boilerplate code generic so that both the code and the configuration can easily be reused for other models; the sketch below illustrates the idea. This promotes consistency, speeds up the development of new models, and ensures that all models run in a stable environment going forward.
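As a rough illustration of what "generic boilerplate plus per-model configuration" can look like, the sketch below instantiates an estimator from a small config object, so adding a model means adding a config entry rather than writing new pipeline code. The model name, estimator and parameters are hypothetical examples, not the asset manager's actual setup.

```python
from dataclasses import dataclass, field
from importlib import import_module


@dataclass
class ModelConfig:
    """Per-model settings; the surrounding pipeline code stays identical."""
    name: str
    estimator: str                     # dotted path to the estimator class
    params: dict = field(default_factory=dict)
    target_column: str = "target"


def build_estimator(cfg: ModelConfig):
    """Instantiate the estimator named in the config, so a new model only
    needs a new config entry, not changes to the shared pipeline code."""
    module_path, class_name = cfg.estimator.rsplit(".", 1)
    estimator_cls = getattr(import_module(module_path), class_name)
    return estimator_cls(**cfg.params)


# Hypothetical example configuration for the sales-opportunity model.
CONFIGS = {
    "sales_opportunities": ModelConfig(
        name="sales_opportunities",
        estimator="sklearn.ensemble.GradientBoostingClassifier",
        params={"n_estimators": 200},
        target_column="converted",
    ),
}

if __name__ == "__main__":
    model = build_estimator(CONFIGS["sales_opportunities"])
    print(model)
```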
Result
The model is now fully automated and future-proof. Predictions are generated automatically and immediately integrated into the dashboard, saving a significant amount of manual work. The only remaining manual process is the approval of the model after training. This final review by a person remains essential to ensure quality.
Additionally, the data scientist has gained considerable knowledge in data engineering and MLOps. This enables her to apply these skills more independently in the future, ensuring that the expertise remains within the organisation.
Future
In the future, we will continue to collaborate with the asset manager’s data scientist to develop a new model for another application. As this model will need to be developed from scratch, we will begin with data science and then integrate it into the same structure for automation. Since the first model was set up using the proper MLOps approach, this process will be significantly more efficient.
Want to know more?
Joachim will be happy to talk to you about what we can do for you and your organisation as a data partner.
Joachim van Biemen, Business Manager
+31 (0)20 308 43 90
+31 (0)6 23 59 83 71
joachim.vanbiemen@digital-power.com