As a Data Platform Engineer, you’ll drive the design, development, and maintenance of our unified data environment. In this role, you’ll ensure our data infrastructure is reliable, efficient, secure, and scalable, providing a strong foundation for teams across the organization to access and utilize data effectively. You’ll collaborate closely with Data Engineers, Data Security, and IT Infrastructure teams to build a robust, scalable, and flexible environment that meets the needs of a diverse group of users.
What you’ll do
- Take responsibility for the design, development, and maintenance of our company's data platforms and technology stack.
- Ensure that data and production systems are highly reliable, available, efficient, secure, and scalable.
- Provision data environments using infrastructure as code (IaC).
- Identify areas for improvement (e.g., cost optimization, deployment processes) and provide hands-on technical assistance.
- Foster a culture of eliminating infrastructure-related incidents by implementing robust observability and monitoring.
- Work closely with our Data Engineers, Data Security, and IT Infrastructure teams to provision secure, scalable, and flexible data environments for various groups of users.
Our Tech Stack: Terraform, Databricks, Spark, Python, Azure Cloud, Power BI
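To give a flavor of the IaC work involved, here is a minimal Terraform sketch of provisioning an Azure Databricks workspace (resource names, region, and SKU are illustrative only, not our actual configuration):

```hcl
terraform {
  required_providers {
    azurerm = {
      source = "hashicorp/azurerm"
    }
  }
}

provider "azurerm" {
  features {}
}

# Illustrative resource group for the data platform
resource "azurerm_resource_group" "data_platform" {
  name     = "rg-data-platform"
  location = "westeurope"
}

# Illustrative Databricks workspace provisioned via IaC
resource "azurerm_databricks_workspace" "main" {
  name                = "dbw-data-platform"
  resource_group_name = azurerm_resource_group.data_platform.name
  location            = azurerm_resource_group.data_platform.location
  sku                 = "premium"
}
```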
What you’ll need
- 5+ years of experience as a DevOps or Platform Engineer.
- University degree (preferably a master's degree) in Computer Science or a related field.
- Strong hands-on experience provisioning secure, scalable, and performant data platforms with IaC (Terraform) in the cloud (preferably Azure with Databricks).
- Expert knowledge of data infrastructure and engineering best practices, especially in the context of security.
- Strong hands-on experience with infrastructure as code (IaC), GitOps, and DataOps solutions and industry standards.
- Knowledge of virtualization and containerization, with hands-on experience in Kubernetes and Docker.
- Knowledge of the key components of a data platform stack for handling structured and unstructured data.
Bonus Points:
- Programming skills in Python or other scripting languages.
- Experience deploying and operationalizing machine learning models (MLOps).
- Experience developing, deploying, and performance-tuning ELT pipelines using data pipeline orchestration tools in distributed systems.
What you’ll bring
- Strong knowledge of cloud infrastructure, data platforms and automation.
- Excellent communication, collaboration, and problem-solving skills.
- Passion for data and customer-centric solutions.
- Ability to actively participate in discussions with a range of technical and non-technical teams.