System Design & Architecture: Define and develop technical specifications, system designs, and scalable architectures for large-scale data infrastructure.
Build and optimize ETL / ELT pipelines to process large, diverse datasets with a focus on performance and reliability.
Expertise with real-time data streaming platforms such as Apache Kafka and Apache Spark (Structured Streaming).
In-depth knowledge of Databricks and cloud-based data architectures for processing and storage at scale.
Proven ability to design, build, and optimize ETL / ELT pipelines using tools such as Apache Airflow, dbt, and Apache NiFi for both batch and real-time processing.