Data Lakehouse Developer

ZILO

London

Hybrid

GBP 45,000 - 80,000

Full time

8 days ago

Job summary

An established industry player is looking for a skilled Data Engineer to join their innovative team. This role involves designing and building efficient data pipelines, performing ETL operations, and managing databases using cutting-edge AWS services. With a focus on collaboration and continuous improvement, you will work closely with data scientists and business analysts to translate data requirements into technical solutions. Join a forward-thinking company that values your expertise and offers a flexible working environment, competitive benefits, and the opportunity to make a real impact in the financial services industry.

Qualifications

  • 3+ years of experience in data engineering roles.
  • Strong proficiency in SQL and ETL processes.
  • Hands-on experience with AWS services for data processing.

Responsibilities

  • Design and maintain efficient data pipelines for processing and analysis.
  • Implement ETL processes to ensure data quality and integrity.
  • Collaborate with teams to deploy and troubleshoot data systems.

Skills

SQL
ETL processes
Database management
Problem-solving
Communication
Collaboration

Education

Bachelor's degree in computer science
Bachelor's degree in engineering

Tools

AWS (S3, Redshift, EMR, Glue)
MySQL
PostgreSQL
MongoDB
Hadoop
Spark
Python
Java

Job description

ZILO is focused on transforming global transfer agency by providing a single global solution that replaces legacy technology and systems. Our mission is to create sustainable value for firms and the customers they serve, all while putting people first. With a design-driven approach and a commitment to innovation, our team of experts has unified the full breadth of global transfer agency into a single, streamlined solution.

At ZILO, we value collaboration, innovation, and continuous improvement. Our team of experienced professionals is dedicated to achieving our goal of being the market-leading solution in global transfer agency. If you are a motivated individual with a passion for technology and a desire to make a real impact in the financial services industry, we would love to hear from you.

About this role

We are seeking a skilled Data Engineer – Data Lakehouse to join our team. The ideal candidate will be experienced in designing and building data pipelines, performing ETL operations, and managing databases, and will also have knowledge of and experience with AWS services for data processing, storage, and analysis.

Responsibilities

  • Design, build and maintain efficient and scalable data pipelines to support data processing and analysis.
  • Develop and implement ETL processes to ensure data quality and integrity.
  • Manage and optimize databases, including schema design, indexing strategies, data partitioning, and data archival.
  • Work closely with data scientists and business analysts to understand data requirements and translate them into technical solutions.
  • Collaborate with DevOps and IT teams to deploy, monitor and troubleshoot data systems.
  • Develop and maintain documentation on data infrastructure and processes.
  • Stay current with the latest data engineering technologies, tools and practices.

Qualifications

  • Bachelor's degree in computer science, engineering or related field.
  • Minimum of 3 years of experience in data engineering roles.
  • Strong proficiency in SQL, ETL processes and database management systems (e.g., MySQL, PostgreSQL, MongoDB).
  • Hands-on experience with AWS services for data processing, storage and analysis (e.g., S3, Redshift, EMR, Glue).
  • Familiarity with programming languages such as Python or Java.
  • Understanding of data warehousing concepts and data modeling techniques.
  • Experience working with big data technologies (e.g., Hadoop, Spark) is an advantage.
  • Excellent problem-solving and analytical skills.
  • Strong communication and collaboration skills.

Benefits

  • Enhanced leave - 38 days inclusive of 8 UK Public Holidays
  • Private Health Care including family cover
  • Life Assurance – 5x salary
  • Flexible working - work from home and/or in our London Office
  • Employee Assistance Program
  • Company Pension (Salary Sacrifice options available)
  • Access to training and development
  • Buy and Sell holiday scheme
  • The opportunity for “work from anywhere/global mobility”