Are you a skilled Data Streaming Engineer passionate about real-time data processing and cutting-edge technology? Join our PBT team and play a pivotal role in shaping our data streaming architecture.
This position offers the opportunity to work with industry-leading tools such as Apache Kafka and Apache Flink, collaborating with talented development and data engineering teams to deliver efficient, scalable solutions.
Key Responsibilities:
Build and Maintain: Design, develop, and manage robust streaming pipelines using Kafka and Flink.
Real-Time Processing: Handle data integration, transformation, and processing in real-time environments.
Collaboration: Work closely with engineering teams to integrate streaming solutions into the broader data ecosystem.
Reliability: Implement monitoring and alerting systems to ensure the reliability and performance of data pipelines.
Optimization: Develop fault-tolerant systems and optimize data flows for scalability and efficiency.
Key Requirements:
Expertise in Kafka and Flink: Proven hands-on experience with both technologies is essential.
Real-Time Data Processing: Strong background in distributed systems and real-time data solutions.
Data Engineering Experience: Skilled in building scalable and maintainable data pipelines.
Programming Proficiency: Advanced knowledge of Python, Java, or Scala.
Cloud Platforms: Experience with AWS, GCP, or Azure is a significant advantage.
DevOps Familiarity: Understanding of DevOps tools and principles.
Additional Skills:
Problem-Solving: Analytical mindset with the ability to tackle complex challenges.
Independence: Ability to work autonomously while managing intricate projects.
Communication: Strong verbal and written communication skills for effective collaboration with cross-functional teams.