Senior Data Engineer

Menrva
Singapore
SGD 60,000 - 80,000
Job description

Job Overview

The Senior Data Engineer will be responsible for requirements gathering, solutioning, and designing and building modern data platforms to support data-driven decision making. This is a hands-on role: the Engineer will execute the technical implementation of data engineering and visualization projects.

The Senior Data Engineer will help build a data and analytics consulting practice by taking part in recruiting efforts, creating technical collateral, and staying on top of technology trends with ongoing training and certifications.

Key Responsibilities

  1. Lead and drive discovery sessions with external clients and build state-of-the-art data architectures. Work with stakeholders to understand their problem statements and data requirements, and implement solutions that meet their needs.
  2. Design, develop, and implement data pipelines to collect and process large volumes of data from various sources.
  3. Implement data storage solutions that are scalable, secure, and efficient, such as data warehouses and databases.
  4. Develop and implement data validation and testing processes to ensure that data is processed accurately and efficiently.
  5. Automate data collection, processing, and reporting processes to minimize manual work and improve efficiency.
  6. Create high-quality documents to capture problem statements, requirements, solutions, and designs.
  7. Support pre-sales activities, including whiteboard sessions, collaborating on solution architecture design, and assisting in proposal and statement of work creation.
  8. Contribute to the development of reusable, repeatable collateral for use across the practice.
  9. Obtain and maintain training and certifications in cloud technologies.
  10. Work with the marketing team to produce content to promote the practice across the region.

Qualifications & Key Requirements:

Must Have:

  1. Bachelor's degree in Computer Science, Information Technology, or a related field.
  2. 7-9 years of experience in data management, database architecture, data engineering, or data visualization.
  3. Excellent problem-solving, organization, debugging, and analytical skills.
  4. Ability to work independently and in a team environment.
  5. Excellent communication skills for effectively expressing ideas to team members and clients.
  6. Strong experience integrating multiple data sources with both structured and unstructured data, in both batch and streaming modes.
  7. Knowledge of cloud computing platforms, such as Amazon Web Services (AWS), Google Cloud Platform (GCP), or Microsoft Azure. AWS skills preferred.
  8. Experience building data pipelines with ETL tools or equivalent cloud services, such as Azure Data Factory or AWS Glue.
  9. Familiarity with data warehousing solutions such as Snowflake, Google BigQuery, Databricks, and Azure Synapse. Snowflake skills preferred.
  10. Experience with open table formats such as Apache Iceberg, Apache Hudi, and Delta Lake.
  11. Experience with dbt (data build tool) for data transformations.
  12. Experience with at least one programming language, such as Python, Java, or Scala.
  13. Experience in at least one RDBMS (MSSQL/MySQL/Oracle/PostgreSQL).
  14. Experience in at least one NoSQL database (MongoDB/Cassandra/Bigtable).
  15. Strong experience with orchestration tools such as Apache Airflow, Apache NiFi, or equivalent cloud-native services. Apache Airflow preferred.
  16. Strong skills in Apache Spark/Flink/Beam.
  17. Experience in designing, implementing, and managing event-driven data pipelines using Kafka or similar cloud-native streaming services like AWS Kinesis, Azure Event Hubs, or Google Cloud Pub/Sub.
  18. Familiarity with Docker and Kubernetes.
  19. Ability to debug and optimize existing data infrastructure and processes as needed.
  20. Familiarity with version control systems, particularly Git, for managing code repositories, branching strategies, and collaborative development workflows.

Nice To Have:

  1. Experience building large-scale, high throughput, 24x7 data systems.
  2. Experience with visualization tools, such as Power BI, Looker, Tableau, and QuickSight.
  3. Any data engineering, visualization, or data science certifications on any of the clouds.
  4. Exposure to machine learning algorithms, AI, and/or LLMs, with practical implementation experience.
  5. Experience with legacy data systems (e.g. Hadoop, Informatica).
  6. Experience with CI/CD concepts and tools for automating data pipeline deployments, testing, and version control, such as Jenkins, GitLab CI, Azure DevOps, or similar platforms.
  7. Experience with data quality frameworks and data governance tools such as Alation, Collibra, or Dataplex.