We are seeking a motivated and enthusiastic Data Engineer to join our dynamic team. This full-time position offers hands-on experience working on real projects, where you will build production data systems and gain valuable industry knowledge.
Responsibilities
Design and implement data pipelines to manage data movement from various sources (databases, APIs, etc.) into storage solutions.
Develop and manage ETL processes using dbt for efficient data transformation.
Optimize data pipelines for efficiency and scalability using the capabilities of the chosen data processing technology, and tune storage solutions for optimal performance.
Design, implement, maintain, and improve CI/CD pipelines for data processes. This includes automating testing, deployment, and configuration management for data infrastructure.
Collaborate with data scientists and business analysts to gather requirements and shape pipeline design.
Develop efficient data processing code that adheres to specifications.
Integrate data from various sources and third-party applications.
Test and implement data processing systems and workflows.
Identify, troubleshoot, and enhance existing data processes.
Collect and assess data input from various sources.
Propose and implement data enhancements and optimizations.
Generate technical documentation for reference and reporting.
Additional Value-Added Skills
Experience with DevOps tools and methodologies, including GitHub.
Understanding of cloud security and governance best practices on Akamai, Azure, AWS, or GCP.
Minimum Requirements
Must be a Malaysian citizen.
Holds a degree in Computer Science or Computer Engineering.
Openness to travel; travel expenses are fully covered by the Company, with full benefits included.
Ability to work independently in a dynamic and complex environment.
Strong communication skills (both verbal and written) – proficiency in English is essential.
Extensive knowledge of data engineering and its related subdomains.
Experience with cloud technologies, cloud-native applications, and data flows.
Proficiency in MS SQL, PostgreSQL, Change Data Capture (CDC), data streaming applications (e.g., Apache Kafka, Qlik Replicate), and related systems.