minden.ai is a technology venture founded by Temasek in strategic partnership with DFI Retail Group and coalition partners BreadTalk Group, DBS Bank, PAssion Card, Mandai Wildlife Group, Singtel, GoJek, FoodPanda and Great Eastern. We are on a mission to redefine how brands and consumers engage across Southeast Asia.
The way we work.
At minden.ai, our culture is the foundation of everything we do. We believe in the power of teamwork and collaboration. We are continuous learners, committed to staying ahead of the curve to drive positive change in our industry. We are seeking individuals with grit and passion, insatiable intellectual curiosity, and a heart for people, to amplify our effectiveness as a team.
As a Senior Data Engineer, you will:
Contribute to the design and implementation of a scalable and reliable data platform based on data mesh principles and a modern data stack.
Develop and maintain efficient data pipelines to ingest, transform, and store data from various sources (a minimal sketch follows this list).
Collaborate with engineers across domains to understand their data needs, develop self-service data platform features, improve the developer experience, and build data access tools and APIs.
Contribute to the development and improvement of internal data products and data marts, including data modelling and data product usability.
Partner with the data science and analytics teams, as well as other customers of the data platform, to ensure their data needs are met.
Follow and contribute to the continuous improvement of data standards and best practices across the full life cycle of the data platform.
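By way of illustration, below is a minimal sketch of the kind of ingest-transform-store job the pipeline responsibility above covers, written in PySpark. The source path, table name, and columns (orders, order_id, amount) are hypothetical placeholders, and real pipelines here may use different connectors and orchestration (e.g. dbt or Airflow).

    # Minimal PySpark sketch of an ingest -> transform -> store step.
    # Paths, table names, and columns are illustrative placeholders only.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("orders_daily_ingest").getOrCreate()

    # Ingest: read raw order events landed by an upstream source (hypothetical path).
    raw = spark.read.json("s3://data-landing/orders/2024-01-01/")

    # Transform: deduplicate on order_id and derive clean, typed columns.
    orders = (
        raw.dropDuplicates(["order_id"])
           .withColumn("order_ts", F.to_timestamp("order_ts"))
           .withColumn("order_date", F.to_date("order_ts"))
           .withColumn("amount", F.col("amount").cast("decimal(12,2)"))
           .filter(F.col("order_id").isNotNull())
    )

    # Store: write a partitioned table for downstream data products to consume.
    (orders.write
           .mode("overwrite")
           .partitionBy("order_date")
           .saveAsTable("analytics.fct_orders"))

    spark.stop()

Partitioning by the derived order_date is just one reasonable choice for downstream query pruning; the actual layout would follow the platform's own standards.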
What you should have:
Bachelor's degree in Computer Science, Data Science, or related field.
5+ years of experience as a Data Engineer, including data modelling; additional experience as a backend software engineer is preferred.
Proficiency in programming languages such as SQL, Python, and Java or Scala, and experience with big data processing frameworks such as Apache Spark or with data warehouses.
Experience with dimensional modelling, data warehouse concepts, slowly changing dimensions (SCDs) and their tradeoffs, and modern data warehouse, data lake, lakehouse, and data mesh architectures (see the SCD sketch after this list).
Experience with a modern data stack (e.g. Snowflake, dbt, Airflow) or similar technologies is a plus.
Experience in building, productionising, and maintaining data lifecycle management and the relevant tooling.
An understanding of data mesh principles and concepts is beneficial.
Experience working in an ambiguous, loosely structured, yet fast-paced startup environment.
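As context for the SCD tradeoffs mentioned above, here is a minimal, illustrative Type 2 merge sketch in PySpark. The table and column names (dim_customer, stg_customer, customer_id, address, effective_from, effective_to, is_current) are assumptions for illustration only; in practice such models often live in dbt or the warehouse itself.

    # Minimal, illustrative SCD Type 2 merge in PySpark. Assumes the dimension has
    # exactly: customer_id, address, effective_from, effective_to, is_current.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("scd2_customer_sketch").getOrCreate()

    dim = spark.table("analytics.dim_customer")   # historised dimension (hypothetical)
    stg = spark.table("staging.stg_customer")     # latest source snapshot (hypothetical)

    history = dim.filter(~F.col("is_current"))    # already-expired versions, kept as-is
    current = dim.filter(F.col("is_current"))

    # Current rows whose tracked attribute differs from the new snapshot.
    joined = current.alias("d").join(
        stg.alias("s"), F.col("d.customer_id") == F.col("s.customer_id"))
    changed = joined.filter(F.col("d.address") != F.col("s.address"))
    changed_keys = changed.select(F.col("d.customer_id").alias("customer_id"))

    # 1) Close the superseded versions.
    closed = (changed.select("d.*")
              .withColumn("effective_to", F.current_date())
              .withColumn("is_current", F.lit(False)))

    # 2) Open new versions carrying the fresh attribute values.
    opened = (changed.select(F.col("s.customer_id").alias("customer_id"),
                             F.col("s.address").alias("address"))
              .withColumn("effective_from", F.current_date())
              .withColumn("effective_to", F.lit(None).cast("date"))
              .withColumn("is_current", F.lit(True)))

    # 3) Current rows with no change pass through untouched.
    unchanged = current.join(changed_keys, "customer_id", "left_anti")

    result = history.unionByName(unchanged).unionByName(closed).unionByName(opened)
    result.write.mode("overwrite").saveAsTable("analytics.dim_customer_scd2")

The sketch keeps expired history untouched and writes to a separate table rather than reading and overwriting the same table in one job; a Type 1 overwrite would be simpler but discards history, which is the kind of tradeoff the requirement refers to.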