Data Engineer

Experis - ManpowerGroup

City Of London

Remote

GBP 50,000 - 80,000

Full time

30 days ago

Job summary

An innovative firm is seeking a skilled Data Engineer with a robust understanding of data concepts and JSON manipulation. This role involves developing and maintaining data pipelines using Python and PySpark while ensuring data security and compliance with regulations. You will work within a dynamic Agile environment, contributing to impactful data processing projects. If you have a passion for data and problem-solving, this opportunity offers a chance to make significant contributions in a supportive and collaborative team. Join a forward-thinking company that values your expertise and provides a platform for growth and innovation.

Qualifications

  • Strong understanding of data concepts and complex JSON manipulation.
  • Experience with Data Pipelines using Python/PySpark frameworks.

Responsibilities

  • Develop and maintain data pipelines with a focus on data security and integrity.
  • Collaborate in an Agile environment to enhance data processing workflows.

Skills

  • Data concepts
  • JSON manipulation
  • Data Pipelines with Python/PySpark
  • Data Security principles
  • Problem solving skills
  • Support role experience
  • Linux system administration
  • Agile methodologies

Tools

  • Jupyter Notebooks
  • Azure Databricks
  • Apache Spark
  • PowerBI
  • JIRA

Job description

Data Engineer

Remote

7 months

Inside IR35 - Umbrella only

Required skills

  • Strong understanding of data concepts - data types, data structures, schemas (both JSON and Spark), schema management etc.
  • Strong understanding of complex JSON manipulation.
  • Experience working with Data Pipelines using custom Python/PySpark frameworks (see the sketch after this list).
  • Strong understanding of the 4 core Data categories (Reference, Master, Transactional, Freeform) and the implications of each, particularly managing/handling Reference Data.
  • Strong understanding of Data Security principles - data owners, row- and column-level access controls, GDPR etc. - including experience of handling sensitive datasets.
  • Strong problem solving and analytical skills, particularly the ability to apply them intuitively (able to work a problem out rather than follow a work instruction).
  • Experience working in a support role would be beneficial, particularly incident triage and handling skills/knowledge (SLAs etc.).
  • Fundamental Linux system administration knowledge - SSH keys and config etc., Bash CLI and scripting, environment variables.
  • Experience using browser-based IDEs (Jupyter Notebooks, RStudio etc.).
  • Experience working in a dynamic Agile environment (SAFe, Scrum, sprints, JIRA etc.).
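
To give a feel for the schema management and JSON handling described above, here is a minimal, hypothetical PySpark sketch: it reads JSON with an explicit schema (rather than relying on inference) and flattens a nested payload. The file name and column names are illustrative only, not part of the role.

```python
# Minimal sketch: explicit Spark schema for JSON input, then flattening a
# nested struct. The file name and column names are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F
from pyspark.sql.types import MapType, StringType, StructField, StructType, TimestampType

spark = SparkSession.builder.appName("json-pipeline-sketch").getOrCreate()

# Declare the schema up front so unexpected fields or type drift fail fast.
schema = StructType([
    StructField("event_id", StringType(), nullable=False),
    StructField("occurred_at", TimestampType()),
    StructField("payload", StructType([
        StructField("customer_id", StringType()),
        StructField("attributes", MapType(StringType(), StringType())),
    ])),
])

raw = spark.read.schema(schema).json("events.json")

# Promote nested payload fields to top-level columns for downstream steps.
flat = raw.select(
    "event_id",
    "occurred_at",
    F.col("payload.customer_id").alias("customer_id"),
    F.col("payload.attributes").alias("attributes"),
)

flat.printSchema()
```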

Languages / Frameworks

  • JSON
  • YAML
  • Python (as a programming language, not just basic scripting; Pydantic experience would be a bonus - see the sketch after this list).
  • SQL
  • PySpark
  • Delta Lake
  • Bash (both CLI usage and scripting).
  • Git
  • Markdown
  • Scala (bonus, not compulsory).
  • Azure SQL Server as a Hive Metastore (bonus).
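
Since Pydantic is called out as a bonus above, here is a minimal sketch of the kind of record validation it enables, assuming the Pydantic v2 API; the Trade model and its fields are invented purely for illustration.

```python
# Minimal sketch: validating a JSON record with Pydantic v2.
# The Trade model and its fields are hypothetical.
from datetime import date

from pydantic import BaseModel, ValidationError


class Trade(BaseModel):
    trade_id: str
    trade_date: date
    notional: float
    currency: str = "GBP"


raw = '{"trade_id": "T-1001", "trade_date": "2024-05-01", "notional": "250000.50"}'

try:
    trade = Trade.model_validate_json(raw)  # coerces types, applies defaults
    print(trade.model_dump())
except ValidationError as exc:
    print(exc)
```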

Technologies

  • Azure Databricks
  • Apache Spark
  • Delta Tables (see the sketch after this list)
  • Data processing with Python
  • PowerBI (Integration / Data Ingestion)
  • JIRA
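
For the Databricks/Delta side, below is a minimal sketch of writing and reading a Delta table with PySpark. It assumes an environment where the Delta format is available (e.g. Azure Databricks); the path and reference-data columns are hypothetical.

```python
# Minimal sketch: write a small reference dataset as a Delta table and read
# it back. Assumes Delta support is configured (e.g. on Azure Databricks);
# the path and columns are hypothetical.
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("delta-sketch").getOrCreate()

ref_currency = spark.createDataFrame(
    [("GBP", "Pound Sterling"), ("EUR", "Euro")],
    ["currency_code", "currency_name"],
)

ref_currency.write.format("delta").mode("overwrite").save("/tmp/ref_currency")
spark.read.format("delta").load("/tmp/ref_currency").show()
```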

If this is the role for you, please submit your CV at your earliest convenience. If you have not had a response within 2 weeks, please assume that you have not been successful on this occasion.
