Proven experience in data engineering, data integration, and data pipeline development.
Strong knowledge of DevOps principles and tools, including CI/CD pipelines, containerization (e.g., Docker), and infrastructure as code (e.g., Terraform).
Proficiency in scripting and programming languages (e.g., Python, Bash) for automation.
Experience with cloud platforms (e.g., AWS, Azure, GCP) and container orchestration (e.g., Kubernetes).
Understanding of data storage solutions, databases (e.g., SQL, NoSQL), and distributed computing frameworks.
Knowledge of data governance, data quality, and data security best practices, and familiarity with tools such as DataHub or OpenMetadata.
DataOps certifications and knowledge of tools such as Apache NiFi, Apache Airflow, or similar are a plus.