A scalable data platform in Azure
TM Forum


TM Forum, an alliance of over 850 global companies, engaged our company as a data partner to identify and solve data-related challenges.
The organisation was facing issues with its data ingestion process. The tool in use was a low-code solution with few learning resources available, and most Data Engineers were unfamiliar with it. Additionally, the tool was not used in other departments within TM Forum, making knowledge transfer difficult.
To address these issues, we worked with TM Forum to set up a scalable and future-proof data platform in Microsoft Azure. This cloud provider was already in use in several departments within TM Forum, and Azure integrates well with Databricks for big data processing via Spark. We set up the data ingestion and data processing in Python code, which is transferable and in line with the skills of the platform's users.
Our approach
Our approach included a Proof of Concept, where we rebuilt one of the data pipelines in Microsoft Azure to test the feasibility of our solution. Within a few weeks, we were able to recreate all functionality of this data pipeline and set up a future-proof structure.
In this process, we used Infrastructure as Code via Terraform to create reusable modules. This approach provided better insight into platform status, clear documentation, and version management. It also allowed for easy rollbacks in case of pipeline failure. Find more details on the (dis)advantages of Infrastructure as Code in our blog.
We built the data pipelines in four steps:
- Loading data: we loaded the data via a standard integration (Salesforce) or a custom API connection (Python code in Databricks).
- Transforming data: we enriched the data, aligned definitions, and renamed columns.
- Checking data quality: we implemented data quality checks based on client definitions, ensuring that only data meeting the quality standards was included in the platform.
- Pushing data to the data warehouse in Snowflake: analysts at TM Forum could then work with the data and derive valuable insights from it.
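The four steps above can be sketched in plain Python. This is an illustrative outline only: the sample records, column names, and quality rules are hypothetical examples, not TM Forum's actual definitions, and in the real pipeline steps 1 and 4 call Salesforce/custom APIs and Snowflake.

```python
def load_data():
    # Step 1 (loading): in practice this calls Salesforce or a custom
    # API from Databricks; here we return hard-coded sample records.
    return [
        {"Member_ID": "M-1", "join_dt": "2021-04-01", "revenue": "1200"},
        {"Member_ID": "M-2", "join_dt": "", "revenue": "800"},
    ]

def transform(records):
    # Step 2 (transforming): align definitions, rename columns to the
    # platform's naming convention, and cast types.
    return [
        {
            "member_id": r["Member_ID"].lower(),
            "joined_on": r["join_dt"] or None,
            "revenue_eur": float(r["revenue"]),
        }
        for r in records
    ]

def check_quality(records):
    # Step 3 (quality checks): keep only rows that meet the
    # client-defined standards -- here, a member id and a join date
    # must both be present.
    return [r for r in records if r["member_id"] and r["joined_on"]]

def push_to_warehouse(records):
    # Step 4 (pushing): the real pipeline writes to Snowflake; here we
    # simply return the rows that would be loaded.
    return records

rows = push_to_warehouse(check_quality(transform(load_data())))
# Only the record with a complete join date passes the quality check.
```

The value of keeping the steps separate like this is that each one can be tested, monitored, and rerun independently, which is what makes the pipelines manageable at scale.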

Result
All data pipelines now run securely in Azure and are managed and monitored via Data Factory. It checks daily whether all data has been properly retrieved, transformed, and verified. This ensures that analysts have access to high-quality data in Snowflake and can extract relevant insights from it.
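To illustrate the kind of daily check such monitoring performs, here is a minimal Python sketch. The table names, thresholds, and hard-coded statistics are made-up stand-ins for what would come from warehouse metadata; this is not the actual Data Factory configuration.

```python
from datetime import date, timedelta

def verify_load(table_stats, today):
    """Return a list of human-readable issues; an empty list means the
    daily load looks healthy. Checks freshness and completeness."""
    issues = []
    for table, stats in table_stats.items():
        # Data older than yesterday counts as stale.
        if stats["last_loaded"] < today - timedelta(days=1):
            issues.append(f"{table}: data is stale")
        # An empty table indicates a failed or skipped load.
        if stats["row_count"] == 0:
            issues.append(f"{table}: no rows loaded")
    return issues

# Hypothetical snapshot of two tables on the day of the check.
issues = verify_load(
    {
        "members": {"last_loaded": date(2023, 5, 2), "row_count": 850},
        "events": {"last_loaded": date(2023, 4, 28), "row_count": 0},
    },
    today=date(2023, 5, 2),
)
# "events" is both stale and empty, so two issues are reported.
```

In a production setup, a non-empty issue list would trigger an alert so the problem is caught before analysts query the data.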
Future
In the future, we will work with TM Forum to identify which features of the cloud platform can be leveraged for analysis. In addition, business logic will be implemented on top of the ingested raw data before it is served to the business, for example via dashboards. We will also make the management of the platform easily transferable to the internal organisation so that they can maintain and expand it themselves.
Want to know more?
Ian will be happy to talk to you about what we can do for you and your organisation as a data partner.
Ian Gardiner, Business Manager | +31 (0)20 308 43 90 | +44 (0)77 8654 1313 | ian.gardiner@digital-power.com