Hiring Local Malaysians only - CONTRACT JOB - 12 MONTHS (EXTENDABLE)
DATA ENGINEER - IT, with extensive experience in Healthcare.
Minimum experience: 3 years; maximum experience: 15 years.
Core responsibilities include:
Work within a highly specialized and growing team to enable delivery of data and advanced analytics system capability.
Develop and implement a reusable architecture of data pipelines to make data available for various purposes including Machine Learning (ML), Analytics and Reporting.
Work collaboratively as part of a team, engaging with system architects, data scientists, and business stakeholders in a healthcare context.
Define hardware, tools, and software to enable the reusable framework for data sharing and ML model productionization.
Work comfortably with structured and unstructured data in a variety of programming languages such as SQL, R, Python, and Java.
Understand distributed programming and advise data scientists on how to structure program code for maximum efficiency.
Build data solutions that leverage controls to ensure privacy, security, compliance, and data quality.
Understand metadata management systems and orchestration architecture when designing ML/AI pipelines.
Apply a deep understanding of cutting-edge cloud technologies and frameworks to enable Data Science.
Integrate Business Intelligence systems with source transactional systems.
Define strategies with Data Scientists to monitor models after they go into production.
Write unit tests and participate in code reviews.
Improve the overall production landscape as required.
Requirements include:
Honours or Master's degree in Computer Science, Engineering, or Software Engineering, with solid experience in data mining and machine learning.
5 to 15 years of work experience.
Expert in programming languages such as R, Python, Scala, and Java.
Expert database knowledge in SQL and experience with MS Azure tools such as Data Factory, Synapse Analytics, Data Lake, Databricks, Azure Stream Analytics, and PowerBI.
Modern Azure data warehouse skills.
Expert Unix/Linux administration experience, including shell script development.
Exposure to AI or model development.
Experience working on large and complex datasets.
Understanding and application of Big Data and distributed computing principles (Hadoop and MapReduce).
ML model optimization skills in a production environment.