Data Engineer

Axiom Software Solutions Limited

Cambridge

Hybrid

GBP 60,000 - 80,000

Full time

9 days ago

Job summary

An innovative firm is seeking a skilled Data Engineer with extensive experience in Snowflake and DBT to join their dynamic team. In this long-term hybrid role, you will design and optimize ETL/ELT pipelines, leveraging your strong Python programming skills to automate data processing tasks. The ideal candidate will work collaboratively with data engineers and analysts to ensure data quality and reliability. This position offers the opportunity to be part of exciting projects while working in a flexible environment that values your expertise and contributions. If you are passionate about data engineering and looking for a challenging role, this opportunity is perfect for you.

Qualifications

  • 5+ years of experience in data engineering with Snowflake and DBT.
  • Strong Python skills for automation and data processing.

Responsibilities

  • Design and optimize ETL/ELT pipelines using Snowflake and DBT.
  • Collaborate with stakeholders to understand data needs.

Skills

Snowflake
DBT
Python
AWS
SQL Performance Tuning
Analytical Skills
Problem-Solving

Education

Bachelor's Degree in Computer Science or related field

Tools

Git
Airflow

Job description

Position: Data Engineer

Location: Cambridge / Luton, UK (Hybrid, 2-3 days onsite per week)

Duration: Long Term B2B Contract

Job Description:

The ideal candidate will have at least 5 years of experience working with Snowflake, DBT, Python, and AWS to deliver ETL/ELT pipelines.

  1. Proficiency in Snowflake data warehouse architecture. Design, build, and optimize ETL/ELT pipelines using DBT (Data Build Tool) and Snowflake.
  2. Experience with DBT (Data Build Tool) for data transformation and modeling. Implement data transformation workflows using DBT (core/cloud).
  3. Strong Python programming skills for automation and data processing. Leverage Python to create automation scripts and optimize data processing tasks.
  4. Proficiency in SQL performance tuning and query optimization techniques using Snowflake.
  5. Troubleshoot and optimize DBT models and Snowflake performance.
  6. Knowledge of CI/CD and version control (Git) tools. Experience with orchestration tools such as Airflow.
  7. Strong analytical and problem-solving skills with the ability to work independently in an agile development environment.
  8. Ensure data quality, reliability, and consistency across different environments.
  9. Collaborate with other data engineers, data analysts, and business stakeholders to understand data needs and translate them into engineering solutions.
  10. Certification in AWS, Snowflake, or DBT is a plus.
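Requirement 8 above (ensuring data quality and consistency across environments) is typically automated with Python checks of the kind this role calls for. The following is a minimal, hypothetical sketch, not part of the posting: the function name, table names, and counts are illustrative, and in a real pipeline the counts would come from Snowflake queries rather than hard-coded dictionaries.

```python
# Hypothetical sketch of a cross-environment data-quality check:
# compare per-table row counts between two environments and flag
# tables that are missing or whose counts diverge beyond a tolerance.

def reconcile_row_counts(source_counts, target_counts, tolerance=0.0):
    """Compare per-table row counts between two environments.

    Returns a dict mapping table name to a (reason, source_count,
    target_count) tuple for every table that is missing on one side
    or whose counts differ by more than `tolerance` (a fraction of
    the source count).
    """
    mismatches = {}
    for table, src in source_counts.items():
        tgt = target_counts.get(table)
        if tgt is None:
            mismatches[table] = ("missing in target", src, None)
        elif src and abs(src - tgt) / src > tolerance:
            mismatches[table] = ("count mismatch", src, tgt)
    for table, tgt in target_counts.items():
        if table not in source_counts:
            mismatches[table] = ("missing in source", None, tgt)
    return mismatches

# Example with made-up numbers (in practice, fetched from Snowflake):
dev = {"orders": 1000, "customers": 250}
prod = {"orders": 990, "customers": 250, "payments": 40}
issues = reconcile_row_counts(dev, prod, tolerance=0.005)
```

In the example, `orders` is flagged because its counts differ by 1% (above the 0.5% tolerance), `payments` is flagged as missing in the dev environment, and `customers` passes. A check like this would typically run under Airflow after each DBT build.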