Your Responsibilities
Build and maintain scalable data pipelines that process and analyze large volumes of data efficiently, using Snowflake, Looker, Airflow, and dbt (see the illustrative sketch after this list).
Collaborate with stakeholders to translate their requirements into concrete technical steps, and coordinate the projects you drive with them.
Monitor and improve the health of our data pipelines.
Promote knowledge sharing within the team to foster collaboration and continuous learning.
Stay up to date on emerging technologies and best practices in data engineering, and bring new ideas to improve the technical setup.
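
To give a flavor of the day-to-day work, here is a minimal sketch of the kind of pipeline described above: an Airflow DAG that builds and tests dbt models. It assumes Airflow 2.4+ with the dbt CLI available on the worker; the DAG id, schedule, and project path are illustrative placeholders, not this team's actual setup.

    from datetime import datetime

    from airflow import DAG
    from airflow.operators.bash import BashOperator

    with DAG(
        dag_id="daily_dbt_refresh",      # hypothetical DAG name
        start_date=datetime(2024, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        # Build the dbt models in the warehouse (e.g. Snowflake), then test them.
        dbt_run = BashOperator(
            task_id="dbt_run",
            bash_command="dbt run --project-dir /opt/dbt/analytics",   # illustrative path
        )
        dbt_test = BashOperator(
            task_id="dbt_test",
            bash_command="dbt test --project-dir /opt/dbt/analytics",
        )
        dbt_run >> dbt_test
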
Your Profile
Solid experience with Airflow, Python, and SQL/dbt.
Experience with cloud data warehouses such as Snowflake is a plus.
Knowledge of AWS services and Docker is a plus.
Bachelor's degree in Mathematics, Computer Science, or another relevant quantitative field.
Strong analytical and quantitative approach to problems.
Familiarity with Data Engineering best practices, including Data Quality and Observability (see the sketch after this list).
Comfortable working in a dynamic, fast-changing environment, with a strong sense of responsibility.
Eagerness to learn and grow through on-the-job learning.
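
As an illustration of the data-quality work mentioned above, here is a minimal sketch of a check against a Snowflake table, assuming the snowflake-connector-python package; the account details, table, and key column are hypothetical placeholders.

    import snowflake.connector

    def check_no_duplicate_keys(conn, table: str, key: str) -> None:
        # Fail loudly if the key column contains duplicates or NULLs.
        with conn.cursor() as cur:
            cur.execute(
                f"SELECT COUNT(*) - COUNT(DISTINCT {key}), "
                f"COUNT_IF({key} IS NULL) FROM {table}"
            )
            dupes, nulls = cur.fetchone()
            if dupes or nulls:
                raise ValueError(f"{table}.{key}: {dupes} duplicates, {nulls} NULLs")

    conn = snowflake.connector.connect(
        account="my_account",        # placeholder connection details
        user="my_user",
        password="...",
        warehouse="ANALYTICS_WH",
        database="PROD",
    )
    check_no_duplicate_keys(conn, "PROD.CORE.ORDERS", "ORDER_ID")

In practice, checks like this are often expressed as dbt tests (unique, not_null) so they run alongside the models themselves.
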
Our Benefits
As part of our team, you will benefit from: