Senior Data Engineer - Commodity Trading | London, UK | Hybrid

twentyAI

London

Hybrid

GBP 50,000 - 90,000

Full time

30+ days ago

Job summary

Join a leading firm in the commodity trading sector as a Senior Data Engineer, where you'll design and optimize cutting-edge data platforms. This dynamic role involves working with modern technologies like Snowflake and Databricks, ensuring seamless data integration for business and regulatory needs. You'll collaborate with engineers and stakeholders to drive data innovation and support best practices in data engineering and DevOps. If you're passionate about data and ready for a challenge, this is the opportunity for you!

Qualifications

  • Proven experience in data engineering, ideally in trading or financial services.
  • Strong knowledge of modern data platforms and technologies.

Responsibilities

  • Build and optimize data platforms for scalability and performance.
  • Implement best practices for data engineering and support business users.

Skills

Data Engineering
Problem-solving
Communication Skills
Python
SQL
DevOps

Tools

Snowflake
Databricks
Synapse/Fabric
PySpark
Power BI
Tableau
MS SQL Server
Oracle
Azure

Job description

Senior Data Engineer - Commodity Trading

Are you a skilled Data Engineer with experience in building and optimizing data platforms? Join a leading firm in the commodity trading sector, where you'll play a key role in developing scalable, high-performance data solutions.

The Role:

As a Senior Data Engineer, you will be responsible for designing, developing, and maintaining robust data platforms. You’ll work with modern technologies such as Snowflake, Databricks, Synapse/Fabric, and PySpark, ensuring seamless data integration and accessibility for business and regulatory needs.

Key Responsibilities:

  1. Build and optimize data platforms for scalability, security, and performance.
  2. Develop and maintain data pipelines and models, integrating tools like Power BI and Tableau.
  3. Implement best practices for data engineering and DevOps.
  4. Support business users in deploying and maintaining data solutions.
  5. Ensure compliance with regulatory data requirements.
  6. Collaborate with engineers and stakeholders to drive data innovation.

Requirements:

  1. Proven experience in data engineering, ideally within trading or financial services.
  2. Strong knowledge of Snowflake, Databricks, Synapse/Fabric, PySpark, and Azure.
  3. Expertise in data modeling, pipelines, and orchestration.
  4. Proficiency in Python and SQL (MS SQL Server, Oracle).
  5. Experience with DevOps, performance optimization, and data governance.
  6. Excellent problem-solving and communication skills.

This is an opportunity to work in a dynamic, fast-paced environment where data plays a crucial role in decision-making. If you’re passionate about data engineering and ready to take on a new challenge, apply now!
