Our Company
We’re Hitachi Digital Services, a global digital solutions and transformation business. Our expertise, innovation, and technology unlock potential – we take theme park fans on magical rides, conserve natural resources, protect rainforests, and save lives. We automate, modernize, optimize, and accelerate. Our people are trusted transformers, with deep engineering expertise, focused on a sustainable future for all.
Imagine the sheer breadth of talent it takes to inspire the future. We don’t expect you to ‘fit’ every requirement – your life experience, character, perspective, and passion for achieving great things in the world are equally important to us.
About us
We’re a global team of innovators. Together, we harness engineering excellence and passion for insight to co-create meaningful solutions to complex challenges. We turn organizations into data-driven leaders that can make a positive impact on their industries and society. If you believe that innovation can inspire the future, this is the place to fulfill your purpose and achieve your potential.
Role Summary:
Solution Design & Architecture
- Architect scalable, high-performance data platforms with hybrid-cloud deployments on Azure and AWS.
- Define data ingestion, transformation, and storage strategies leveraging tools such as NiFi, Airflow, Pentaho, and Talend.
- Design multi-tenant data models, metadata management, and governance frameworks to ensure data integrity and security.
- Lead the migration of legacy analytics environments (SAP BW, SAS) to modern data architectures (Snowflake, PySpark, Azure Synapse, Hadoop, Cloudera).
- Implement streaming data pipelines for real-time data ingestion using Kafka, MQTT, and IoT-based architectures.
Delivery & Implementation
- Provide technical leadership and hands-on execution in data engineering, ensuring optimal performance tuning of ETL pipelines.
- Lead large-scale data migration and cloud transformation projects, including SAS-to-PySpark and Hadoop-to-Snowflake transitions.
- Oversee data quality and governance implementations, leveraging tools like Atlas and Lumada Data Catalog.
- Design and optimize data lakes and warehouses using HP Vertica, Postgres, MongoDB, Cloudera, and Oracle.
- Establish CI/CD and automation best practices for data pipeline deployment and monitoring.
Presales & Client Engagement
- Engage with C-level executives, IT leaders, and business teams to define data transformation roadmaps.
- Lead solution demonstrations, technical deep-dives, and Proof of Concepts (PoCs) for cloud and data analytics solutions.
- Develop RFP/RFI responses, creating detailed architecture blueprints and migration strategies.
- Provide strategic consulting in BFSI, Telecom, and Healthcare industries, leveraging domain expertise.
Technology & Innovation
- Drive emerging technology adoption in IoT-driven smart city solutions, integrating computer vision and AI analytics.
- Develop and implement fraud detection algorithms for payment processing systems using advanced analytics techniques.
- Ensure solutions align with regulatory compliance standards (GDPR, HIPAA, PCI DSS, etc.).
Qualifications and Key Technical Skills:
Core Expertise (Must-Have)
- 10+ years of relevant experience in data architecture, cloud computing, and big data solutions.
- Hands-on experience in MapR and Pentaho, with deep technical expertise in Pentaho PDI, C-Tools, and Cloudera ecosystems.
- Extensive hands-on experience in NiFi, Snowflake, Airflow, Azure, and AWS, with proven implementation expertise.
- Experience in multi-cloud hybrid architectures, particularly Azure-based ETL workflows with ADF and Talend.
- Strong background in IoT, real-time analytics, and event-driven data processing, using Kafka, MQTT, and WSO2.
- Expertise in financial data platforms, payment processing systems, and fraud detection analytics.
- Ability to architect, optimize, and migrate large-scale analytics platforms (SAP BW, SAS, Hadoop, PySpark).
- Hands-on experience with data governance and cataloging tools (Atlas, Lumada Data Catalog).
- Strong knowledge of performance tuning in distributed computing environments (HP Vertica, Postgres, MongoDB, Oracle, Cloudera).
- Experience in leading cross-functional teams, ensuring seamless delivery execution.
Preferred Skills
- Experience in healthcare IT systems, enterprise architecture, and data security frameworks.
- Strong knowledge of DevOps for data (Terraform, Kubernetes, HashiCorp Vault, Prometheus, Grafana).
- Exposure to computer vision and AI-driven smart city initiatives.
- Prior experience in managing payment processing and collections platforms using API-driven architectures.
Education & Certifications
- Master’s degree in Computer Science, Engineering, or related field.
- Relevant cloud certifications (AWS Certified Solutions Architect, Azure Solutions Architect, Snowflake SnowPro, etc.).