About
Name | Maarten Peters |
Title | Senior Principal Data Engineer/Scientist |
Nationality | Dutch |
Residence | Voorburg, The Netherlands |
Email | maarten.peters@valcon.com |
Mobile | +31 6 13680558 |
LinkedIn | https://linkedin.com/in/maartenjpeters |
Career Overview
- 2022 - Present: Valcon - Senior Principal Data Engineer/Scientist
- 2018 - 2021: VIQTOR DAVIS - Senior Data Engineer
- 2017 - 2017: Jibes - Business Intelligence Consultant
- 2013 - 2016: Sanoma Media - Business Analyst
Education
- 2019 - 2022: Information Studies, track Data Science - University of Amsterdam
- 2008 - 2014: Communication & Multimedia Design - The Hague University of Applied Sciences
Certificates
- 2020 - 2020: Python: The Big Picture - Pluralsight.com
- 2017 - 2017: The Python Developer’s Toolkit - Pluralsight.com
- 2017 - 2017: Full Stack Web Development with Python (WEB2PY) - Pluralsight.com
- 2017 - 2017: Python: Getting Started - Pluralsight.com
- 2017 - 2017: Python Fundamentals - Pluralsight.com
- 2018 - 2018: Data Scientist with Python Track - Datacamp.com
- 2018 - 2018: Data Scientist with R Track - Datacamp.com
- 2017 - 2018: Data Analyst with Python Track - Datacamp.com
- 2017 - 2017: Data Analyst with R Track - Datacamp.com
- 2017 - 2017: Python Programmer - Datacamp.com
- 2017 - 2017: R Programmer - Datacamp.com
- 2018 - 2019: Deep Learning Specialization - Coursera.org
- 2018 - 2018: Convolutional Neural Networks - Coursera.org
- 2018 - 2018: Neural Networks and Deep Learning - Coursera.org
- 2015 - 2015: R Programming - Coursera.org
- 2015 - 2015: The Data Scientist’s Toolbox - Coursera.org
- 2017 - 2017: PSM-1 Scrum Master - Scrum.org
- 2022 - 2022: Intro to Splunk - Splunk.com
- 2022 - 2022: Scheduling Reports & Alerts - Splunk.com
- 2022 - 2022: Using Fields - Splunk.com
- 2022 - 2022: Visualizations - Splunk.com
- 2022 - 2022: What is Splunk - Splunk.com
- 2022 - 2022: Partner Training - Developer Foundations - Databricks.com
- 2022 - 2022: Academy Accreditation - Databricks Lakehouse Fundamentals - Databricks.com
- 2024 - 2025: Microsoft Certified: Azure Fundamentals - Microsoft.com
- 2024 - 2025: Microsoft Certified: Azure Data Fundamentals - Microsoft.com
- 2024 - 2025: Microsoft Certified: Azure Data Scientist Associate - Microsoft.com
Profile
Maarten Peters is a versatile data scientist and engineer with a broad background in data applications. He began his career as a Business Analyst at Sanoma Media B.V., where he developed insights into real-time bidding data for the programmatic advertising department and collaborated with the data science team on revenue optimization. Moving to Jibes as a BI Consultant, he specialized in data warehousing and engineering, then transitioned into Viqtor Davis and Valcon as a Principal Data Engineer and Data Scientist, working with platforms such as Azure, Databricks and Python. His personal interests include electronics, DIY and sports.
Project experience
2022 - 2022: IKEA - Data Engineer
As a Data Engineer at IKEA for the Sustainability project, I worked on the Retail footprint. Within this footprint I was mainly responsible for the design and implementation of the ETL processes within Azure Databricks. The data models followed a calculation model designed by IKEA, and the processes had to fit within an enterprise Data Mesh lakehouse architecture. Processes were developed with CI/CD, data quality testing, data lineage and schema validation. In the last weeks of the project, we also played a role in developing an NLP classification model for linking bills of materials.
Responsibilities
- Design and implementation of ETL pipelines
- Analysis of incoming data
- Validation and monitoring of data quality
Tools
Agile/Scrum, Databricks, Azure DevOps, Azure Data Factory, CI/CD, Python, SQL
2021 - 2022: Schiphol - Data Engineer
Developing Python/Scala applications for the Central Data Factories team to ingest real-time data into the Core Data Platform, using Gitlab, Openshift, Databricks and Confluent, hosted on Microsoft Azure. Applications provided real-time data streaming, batch replay mechanisms and full CI/CD development with infrastructure-as-code in Terraform.
Responsibilities
- Collaboration on data ingestion pipelines with Data Science/Core Data Platform teams
- Setting up Azure & Kafka cloud infrastructure in Terraform
- Migration of Databricks applications for IoT smart building project
- Developing data ingestion/ETL applications for Databricks
- Maintaining and developing container applications in Openshift
Tools
Agile/Scrum, Databricks, Openshift, Kubernetes, Confluent, Kafka, Docker, Apache Airflow, Infrastructure-as-code, Terraform, Gitlab, CI/CD, Python, Scala, SQL
2022 - 2022: Santander - Data Scientist
Proof-of-concept of an investment appetite classification model, implemented in Squirro's low-code product. Responsible for model development and validation.
Responsibilities
- Model development
- Validation
Tools
Agile/Scrum, Python, Squirro
2022 - 2022: LCPS - BI Consultant
Data warehouse development on COVID-19 patient transfer data and implementing Power BI reports for the Dutch government, providing insights into the pandemic, facilitating response measures.
Responsibilities
- Development of Azure Data Factory pipelines
- Setting up a model with calculations in DAX
- Data visualization
- Maintenance and development of the data warehouse
Tools
Azure Data Factory, Microsoft SQL Server, Microsoft Power BI, SQL, DAX
2020 - 2021: Stedin - Data Engineer
Collaborating on development of a big data platform, developing FastAPI applications deployed in Kubernetes containers and aiding in deployment of data science applications. All development was performed via CI/CD and tasks were delegated with Scrum.
Responsibilities
- Development of the big data platform and improvement of its stability
- CI/CD development in Python for Kubernetes containers
- REST API development with FastAPI
- Deployment of Data Science use cases
Tools
Agile/Scrum, HDInsight, Apache Hive, Apache NiFi, Informatica Big Data Management, FastAPI, CI/CD, Python, Kubernetes, Docker, Azure Functions, Microsoft SQL Server, SQL
2019 - 2020: Fontem Ventures - Data Engineer
Implementing a lakehouse architecture for e-commerce analysis on Databricks and Power BI, with infrastructure-as-code and Azure DevOps CI/CD pipelines.
Responsibilities
- Development of Azure cloud infrastructure for a Databricks & Power BI reporting environment
- Setup of ETL pipelines handling Google BigQuery data
Tools
Agile/Scrum, Python, Infrastructure-as-code, Terraform, Databricks, CI/CD, Azure DevOps, Azure Data Factory, Microsoft SQL Server, SQL, Microsoft Power BI
2019 - 2019: Port of Rotterdam - BI Consultant
Developing a data warehouse implementation for the OnTrack application in PostgreSQL, with ETL performed by AWS Lambda and infrastructure-as-code in Terraform.
Responsibilities
- Developing a data warehouse implementation in PostgreSQL
- Setting up ETL in AWS & Terraform
- Modelling data for reporting
Tools
Agile/Scrum, AWS Lambda, PostgreSQL, Python, R, SQL, Terraform, Infrastructure-as-code
2018 - 2019: Sligro - BI Consultant
Implementing an on-premises data warehouse in Microsoft SQL Server, with a data vault layer and a reporting data model.
Responsibilities
- Developing a data warehouse implementation in Microsoft SQL Server
- Designing and implementing a Data Vault
- Modelling data for reporting
Tools
Agile/Scrum, Microsoft SQL Server, SQL, Qlik Sense, Data Vault
2018 - 2018: PostNL - BI Consultant
As a BI Consultant I was responsible for developing reports in Power BI and developing data models in Microsoft SQL Server.
Responsibilities
- Report development in Microsoft Power BI
- Consulting in Power BI and Data Lake modelling
- Modelling data for reporting
Tools
Microsoft SQL Server, SQL, Power BI Desktop
2018 - 2018: VodafoneZiggo - BI Consultant
Report development and consulting in Qlik Sense, along with development on reporting data in Microsoft SQL Server.
Responsibilities
- Report development in Qlik Sense
- Consulting in Qlik Sense architecture and implementation
- Modelling data for reporting
Tools
Agile/Scrum, Microsoft SQL Server, Oracle SQL Developer, SQL, Qlik Sense Enterprise, Qlik Sense Desktop, Python
2017 - 2018: Connexxion - BI Consultant
Developed reports in Power BI and implemented a SQL Server data warehouse in Transact-SQL and SSIS, along with a data vault and reporting layer. Also developed a real-time Power BI dashboard, providing insights into sensor data from buses for maintenance.
Responsibilities
- Report development in Microsoft Power BI
- Administration, migration and implementation of the Power BI service
- SQL Server data warehouse development (T-SQL & SSIS)
- Set-up of App Workspaces
Tools
Agile/Scrum, Microsoft SQL Server, SQL Server Integration Services, SQL, Team Foundation Server, Microsoft Power BI
2017 - 2017: Nationale Nederlanden - BI Consultant
Report development in Microsoft Power BI & SAP Business Objects, with data modelling in an Oracle data warehouse. Also implemented R/Python scripts for automated on-premises data refresh.
Responsibilities
- Report development in Microsoft Power BI, SAP Business Objects
- Data modelling in Oracle
- R/Python development
Tools
Microsoft SQL Server, Oracle SQL Data Warehouse, SQL, SQL Server Integration Services, SAP Business Objects
2017 - 2017: Irdeto - BI Consultant
Development on the Microsoft SQL Server data warehouse, responsible for migrating the data ingestion of Microsoft Dynamics with SSIS and BIML.
Responsibilities
- Migration of Microsoft Dynamics CRM to Dynamics 365 for the SQL Server data warehouse
Tools
Microsoft SQL Server, SQL, SQL Server Integration Services, Tableau
2017 - 2017: PostNL - BI Consultant
As a BI Consultant at PostNL, I developed reports in Amazon QuickSight on top of a self-developed data warehouse built on Amazon Redshift and AWS Lambda, the result of a migration from Microsoft Access and VBScript.
Responsibilities
- Report development in Amazon QuickSight
- Python development for AWS Lambda
- Redshift database development and administration
- Microsoft Access and VBScript development
Tools
Microsoft Access, AWS Redshift, AWS Lambda, Data modelling, SQL, PostgreSQL, Python
2013 - 2016: Sanoma Media - Business Analyst
At Sanoma Media B.V. there was a growing focus on the digital ad marketplace, with more and more attention given to programmatic advertising. Ads were served via real-time bidding (RTB), and increasingly large volumes of data were collected. To visualize this information, a big data environment was developed with Hadoop/Apache Hive as the data lake and QlikView as the reporting tool.
I was responsible for developing queries against Apache Hive and designing ETL processes for data acquisition into QlikView. I also developed the front-end of almost all reports, with varying degrees of flexibility that let users select, compare and analyze nearly every dimension of the data.
Responsibilities
- Report development in QlikView
- Developing ETL via QlikView Management Console and Apache Hive
- Apache Hive data lake development
- Sales reporting and analysis
Tools
Agile/Scrum, R, QlikView, Apache Hive, Data modelling, SQL, PHP, HTML5, AWS, CSS, JavaScript, GitHub, Bash
Skills and expertise
Key skills
Coding
Python | ★★★★ |
Scala | ★★★☆ |
SQL | ★★★★ |
Bash | ★★☆☆ |
R | ★★☆☆ |
Tools
Kafka | ★★☆☆ |
Databricks | ★★★★ |
IPython | ★★★☆ |
Git | ★★★☆ |
Kubernetes | ★★☆☆ |
Microsoft SQL Server | ★★★★ |
Microsoft Power BI | ★★★★ |
Consulting
Business Analytics | ★★★☆ |
Capability Building | ★☆☆☆ |
Project Acquisition | ★☆☆☆ |
Project Definition | ★★☆☆ |
Project Management | ★★★☆ |
Solution Design | ★★★☆ |
Stakeholder Management | ★★☆☆ |
Organisation
Administration | ★★★☆ |
Domain Understanding | ★★☆☆ |
Conceptual Reasoning | ★★★☆ |
Team Management | ★★☆☆ |
Data & techniques
Structured Data: CSV, JSON, Parquet, Delta, etc. | ★★★★ |
Unstructured Data: Web pages, PDF, images, etc. | ★★☆☆ |
Databases | ★★★☆ |
APIs | ★★☆☆ |
Data Warehousing | ★★★★ |
Report Development | ★★★★ |
Data Modelling | ★★★☆ |
Machine Learning | ★★☆☆ |
Proficiency levels:
- ☆☆☆☆: No experience
- ★☆☆☆: Basic experience; average practical proficiency
- ★★☆☆: Good theoretical and practical level; can work independently
- ★★★☆: Above-average experience and knowledge; ability to lead and coach
- ★★★★: Expert; ability to architect, consult and advise end-to-end