Your Data Engineering partner

Data Engineer at work

Generate reliable and meaningful insights from a solid, secure and scalable infrastructure. Our team of 25+ Data Engineers is ready to implement, maintain and optimise your data products and infrastructure end-to-end.

Data Engineering challenges we can solve for you

image of euros

Fast and reliable internal information using AI Document Explorer

Financial institutions need to process large amounts of documentation. At this particular institution, an internal team supports this by, among other things, creating summaries using text analysis and natural language processing (NLP) and making them available to the various business units. To make audits more efficient, the team wanted to develop a question-and-answer model that gets the right information to those units faster. When ChatGPT was launched, they asked us to create a proof of concept.

Read more
kadaster header

Working more efficiently thanks to migration to Databricks

The Kadaster manages complex (geo)data, including data on all real estate in the Netherlands. All data is stored and processed in an on-premises data warehouse in Postgres, and they rely on an IT partner to maintain it. The Kadaster aims to save costs and work more efficiently by migrating to a Databricks environment. They asked us to assist in implementing this data lakehouse in the Microsoft Azure Cloud.

Read more
iphone with spotify music

Converting billions of streams into actionable insights with a new data & analytics platform

Merlin is the largest digital music licensing partner for independent labels, distributors, and other rightsholders. Merlin’s members represent 15% of the global recorded music market. The company has deals in place with Apple, Facebook, Spotify, YouTube, and 40 other innovative digital platforms around the world for its members’ recordings. The Merlin team tracks payments and usage reports from digital partners while ensuring that their members are paid and reported to accurately, efficiently, and consistently.

Read more
elevator

20% fewer complaints thanks to data-driven maintenance reports

An essential part of Otis's business operations is the maintenance of their elevators. To time this effectively and proactively inform customers about the status of their elevator, Otis wanted to implement continuous monitoring. They saw great potential in predictive maintenance and remote maintenance.

Read more
business managers having a conversation

Insight into the complete sales funnel thanks to a data warehouse with dbt

Our consultants log the assignments they take on for our clients in our ERP system, AFAS. Our CRM system, HubSpot, holds all the information that is relevant before a collaboration agreement is signed. When we close a deal, all the information from HubSpot automatically transfers to AFAS. So HubSpot is mainly used for the phase before a collaboration starts, while AFAS is used for the phase after it. To tighten our consultants' planning and improve our financial forecasts, we decided to set up a data warehouse that integrates data from both sources.

Read more
potatoes

Valuable insights from Microsoft Dynamics 365

Agrico is a cooperative of potato growers. They cultivate potatoes for various purposes such as consumption and planting future crops. These potatoes are exported worldwide through various subsidiaries. All logistical and operational data is stored in their ERP system, Microsoft Dynamics 365. Due to the complexity of this system with its many features, the data is not suitable for direct use in reporting. Agrico asked us to help make their ERP data understandable and develop clear reports.

Read more
woman shopping online

A standardised way of processing data using dbt

One of the largest online shops in the Netherlands wanted to develop a standardised way of data processing within one of its data teams. All data was stored in the scalable cloud data warehouse Google BigQuery. Large amounts of data were available within this platform regarding orders, products, marketing, returns, customer cases and partners.

Read more
dutch highway

Reliable reporting using robust Python code

The National Road Traffic Data Portal (NDW) is a valuable resource for municipalities, provinces, and the national government to gain insight into traffic flows and improve infrastructure efficiency.

Read more
valk exclusief

Setting up a future-proof data infrastructure

Valk Exclusief is a chain of 4-star+ hotels with 43 locations in the Netherlands. The hotel chain wants to offer guests a personal experience, both in the hotel and online.

Read more
data platform

A scalable data platform in Azure

TM Forum, an alliance of over 850 global companies, engaged our company as a data partner to identify and solve data-related challenges.

Read more

A fully automated data import pipeline

Stichting Donateursbelangen aims to strengthen trust between donors and charities. They believe that this trust rests on raising money honestly, openly, transparently and respectfully, while using the donated funds effectively to make an impact. To further this goal, Stichting Donateursbelangen wants to share information about charities with donors through their own search engine.

Read more
data mental healthcare

Central data storage with a new data infrastructure

Dedimo is a collaboration of five mental healthcare initiatives. To continuously enhance the quality of their care, they organise their internal processes more efficiently, using insights from the data that is available internally. Previously, they pulled this data from the various source systems themselves with ad hoc scripts. They asked us to make this process more robust and efficient, to professionalise it further, and to facilitate central storage of their data in a cloud data warehouse. Since they were already used to working with Google Cloud Platform (GCP), the goal was to set up the data infrastructure within this environment.

Read more
lake

Improved data quality thanks to a new data pipeline

At Royal HaskoningDHV, the number of requests from customers with Data Engineering issues continues to climb. The new department they set up for this is growing, so they asked us to temporarily give their Data Engineering team extra capacity. One of the issues we helped with involved the Aa en Maas Water Authority.

Read more
fysioholland data

A well-organised data infrastructure

FysioHolland is an umbrella organisation for physiotherapists in the Netherlands. A central service team relieves therapists of additional work, so that they can mainly focus on providing the best care. In addition to organic growth, FysioHolland is connecting new practices to the organisation. Each of these has its own systems, work processes and treatment codes. This has made FysioHolland's data management large and complex.

Read more
billboards

A scalable machine-learning platform for predicting billboard impressions

The Neuron provides a programmatic bidding platform to plan, buy and manage digital Out-Of-Home ads in real-time. They asked us to predict the number of expected impressions for digital advertising on billboards in a scalable and efficient way.

Read more

Measurable impact on social change using a data lake

RNW Media is an NGO that focuses on countries where there is limited freedom of expression. The organisation tries to make an impact through online channels such as social media and websites. To measure that impact, RNW Media drew up a Theory of Change (a kind of KPI framework for NGOs).

Read more

Q&A about Data Engineering

Read how we approach projects and what our vision of the modern Data Engineer is.

schedule an online meeting

Trust your data

Do you want to start collecting data but don't know where to begin? Have you already started but need assistance with monitoring your processes and safeguarding your quality? Or do you feel that the data you collect does not contribute optimally to achieving your goals? Joachim will gladly discuss possible solutions with you.

Be inspired about Data Engineering

Low-code/no-code or custom coding?

Years ago, you couldn't develop an application or process without knowledge of complex programming languages like JavaScript, PHP, and Python. You needed a programmer or Data Engineer. Today, there is a shortage of technical experts, while more and more low-code solutions are appearing on the market. These tools allow you to get started without in-depth technical knowledge. Whether this is the right solution for you depends on various factors. Make the right decision with the help of this article.

Read more

The organisational benefits of implementing your own AI-chatbot

With the increasing availability of cloud services that enable companies to leverage Large Language Models, it has become relatively easy to set up your own GPT model. However, one important question needs to be answered before you start building: what are the benefits for my organisation?

Read more
ai document explorer example

How does the AI Document Explorer work in practice?

The AI Document Explorer (AIDE) is a cloud solution developed by Digital Power that utilises OpenAI's GPT model. It can be deployed to quickly gain insights into company documents. AIDE securely indexes your files, enabling you to ask questions about your own documents. Not only does it provide you with the answers you are looking for, but it also references the locations where these answers are found.
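
To make the pattern concrete, here is a deliberately simplified Python sketch of the flow AIDE follows: index documents, find the passages most relevant to a question, and return them together with their source. This is not the actual AIDE implementation, which uses OpenAI's GPT model and secure indexing in the cloud; the folder name, example question and word-overlap scoring below are purely illustrative.

```python
from pathlib import Path


def build_index(folder: str) -> dict[str, str]:
    """Read every .txt file in the folder and keep its content, keyed by file name."""
    return {p.name: p.read_text(encoding="utf-8") for p in Path(folder).glob("*.txt")}


def answer(question: str, index: dict[str, str], top_n: int = 3) -> list[tuple[str, str]]:
    """Return the passages that best match the question, each with its source document."""
    terms = set(question.lower().split())
    hits = []
    for doc, text in index.items():
        for passage in text.split("\n\n"):
            # Toy relevance score: how many question words appear in the passage.
            score = len(terms & set(passage.lower().split()))
            if score:
                hits.append((score, doc, passage.strip()))
    hits.sort(key=lambda hit: hit[0], reverse=True)
    return [(doc, passage) for _, doc, passage in hits[:top_n]]


if __name__ == "__main__":
    index = build_index("company_documents")  # hypothetical folder of text files
    for source, passage in answer("What is our travel expense policy?", index):
        print(f"[{source}] {passage}")
```

In AIDE, the retrieval and answering are handled by the GPT-based cloud solution rather than word overlap, but the basic flow of indexing, retrieving and citing the source location is the same.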

Read more
colleagues talking to each other

Bring structure to your data

There are many different forms of data storage. In practice, a (relational) database, a data warehouse, and a data lake are the most commonly used and often confused with each other. In this article, you will read about what they entail and how to use them.

Read more

Data quality: the foundation for effective data-driven work

Data projects often need to deliver results quickly. The field is relatively new, and to gain support, it must first prove its value. As a result, many organisations build data solutions without giving much thought to their robustness, often overlooking data quality. What are the risks if your data quality is not in order, and how can you improve it? Find the answers to the key questions about data quality in this article.

Read more
people working together

The all-round profile of the modern data engineer

Since the field of big data emerged, many elements of the modern data stack have become the Data Engineer's responsibility. What are these elements, and how should you build your data team?

Read more
explanation about analytics engineering

Unlocking the power of Analytics Engineering

The world of data is continuously shifting, and so are the corresponding jobs and responsibilities within data teams. With this, an up-and-coming role has appeared on the horizon: the Analytics Engineer.

Read more

What is machine learning operations (MLOps)?

Bringing machine learning models to production has proven to be a complex task in practice. MLOps helps organisations that want to develop and maintain models themselves to ensure their quality and continuity. Read this article to get answers to the most frequently asked questions on this topic.

Read more

What is a data architecture?

Working in a data-driven way helps you make better decisions. The better your data quality, the more you can rely on it. A good data architecture is a basic ingredient for data-driven working. In this article, we explain what a data architecture is and what a Data Architect does.

Read more

The foundation for Data Engineering: solid data pipelines

Essentially, Data Engineers work on data pipelines: data processes that retrieve data from one place and write it to another. In this article you can read more about how data pipelines work and discover why they are so important for a solid data infrastructure.
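
As a minimal illustration of such a pipeline, the Python sketch below retrieves records from one place (a CSV export), cleans them, and writes them to another (a local SQLite table). The file name, column names and storage choices are hypothetical; a real pipeline would typically load into a cloud data warehouse and add scheduling, monitoring and data quality checks.

```python
import csv
import sqlite3


def extract(path: str) -> list[dict]:
    """Retrieve raw records from a source system export (here: a CSV file)."""
    with open(path, newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))


def transform(rows: list[dict]) -> list[tuple]:
    """Clean and reshape the records so they fit the target table."""
    return [
        (row["order_id"], row["customer"], float(row["amount"]))  # hypothetical columns
        for row in rows
        if row.get("amount")
    ]


def load(rows: list[tuple], db_path: str = "warehouse.db") -> None:
    """Write the cleaned records to the destination (here: a local SQLite table)."""
    with sqlite3.connect(db_path) as con:
        con.execute(
            "CREATE TABLE IF NOT EXISTS orders (order_id TEXT, customer TEXT, amount REAL)"
        )
        con.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)


if __name__ == "__main__":
    load(transform(extract("orders.csv")))  # hypothetical source file
```

However simple, the extract, transform and load steps shown here are the same building blocks a production pipeline is made of.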

Read more
Data Engineer and ML Engineer talking to each other

What does a (Cloud) Data Engineer do versus a Machine Learning Engineer?

In the world of data and technology, Data Engineers and Machine Learning Engineers are crucial players. Both roles are essential for designing, building, and maintaining modern data infrastructures and advanced machine learning (ML) applications. In this blog, we focus specifically on the roles and responsibilities of a Data Engineer and Machine Learning Engineer.

Read more

Why do I need Data Engineers when I have Data Scientists?

It is now clear to most companies: data-driven decisions powered by Data Science add concrete value to business operations. Whether your goal is to build better marketing campaigns, perform preventive maintenance on your machines or fight fraud more effectively, there are applications for Data Science in every industry.

Read more
implementing a data platform

Implementing a data platform

The purpose of this blog is to share our knowledge and experience with the community by describing guidelines for implementing a data platform in an organisation. We understand that every organisation has different needs, that these needs affect the technologies used, and that a single architecture satisfying them all makes no sense. So in this blog, we keep things as general as we can.

Read more

5 reasons to use Infrastructure as Code (IaC)

Infrastructure as Code has proven itself as a reliable technique for setting up platforms in the cloud. However, it does require an additional investment of time from the developers involved. In which cases does the extra effort pay off? Find out in this article.

Read more

Setting up Azure App functions

In this article, we start by discussing Serverless Functions. We then demonstrate how to use Terraform files to simplify deploying a target infrastructure, how to create a Function App in Azure, how to use GitHub workflows to manage continuous integration and deployment, and how to use branching strategies to selectively deploy code changes to specific instances of Function Apps.
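
As an impression of the application code that ends up in such a Function App, here is a minimal HTTP-triggered Azure Function using the Python programming model. It is a generic example rather than code from the article: the query parameter and response are placeholders, and the function still needs its function.json binding configuration plus the Terraform-managed infrastructure described above.

```python
import logging

import azure.functions as func


def main(req: func.HttpRequest) -> func.HttpResponse:
    """Minimal HTTP-triggered function: greets the caller by name."""
    name = req.params.get("name", "world")  # placeholder query parameter
    logging.info("Processing request for %s", name)
    return func.HttpResponse(f"Hello, {name}!", status_code=200)
```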

Read more

AWS (Amazon Web Services) vs GCP (Google Cloud Platform) for Apache Airflow

This article compares the two managed Airflow services Cloud Composer and MWAA (Amazon Managed Workflows for Apache Airflow). It will help you understand their similarities, their differences, and the factors to consider when choosing between them. Note that there are other good options for hosting a managed Airflow implementation, such as the one offered by Microsoft Azure. These two are compared here because of my hands-on experience with both managed services and their respective ecosystems.
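
To give an idea of what actually runs on either service, below is a minimal Airflow DAG in Python. The DAG id, schedule and task are illustrative placeholders; the point is that a DAG like this is platform-agnostic, so the same file can be deployed to Cloud Composer or to MWAA without changes.

```python
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def say_hello():
    # Placeholder task: in practice this would call your ingestion or transformation logic.
    print("Hello from a managed Airflow environment")


# Minimal DAG definition; the same file can be uploaded to Cloud Composer or MWAA.
with DAG(
    dag_id="example_portable_dag",   # illustrative name
    start_date=datetime(2024, 1, 1),
    schedule_interval="@daily",
    catchup=False,
) as dag:
    hello = PythonOperator(task_id="say_hello", python_callable=say_hello)
```

In practice, the choice between the two therefore tends to hinge on the surrounding cloud ecosystem rather than on the DAG code itself.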

Read more

Kubernetes-based event-driven autoscaling with KEDA: a practical guide

This article explains the essence of Kubernetes Event-driven Autoscaling (KEDA). We then configure a local development environment to demonstrate KEDA using Docker and Minikube. Following this, we describe the scenario that will be implemented to showcase KEDA and walk through each step of it. By the end of the article, you will have a clear understanding of what KEDA entails and how you can implement an architecture with KEDA yourself.

Read more