Data Architect

Experis - ManpowerGroup

England

Hybrid

GBP 125,000 - 150,000

Full time

7 days ago

Job summary

An established industry player is seeking a DPS Data Architect to lead the development of innovative data platforms. In this hybrid role, you will focus on optimizing data pipelines and enhancing self-service capabilities for data analysis. You will collaborate with cross-functional teams to improve technology stacks and ensure compliance with data privacy regulations. This role offers the opportunity to work in a dynamic environment, utilizing your expertise in ETL processes and agile methodologies to drive impactful data solutions. If you thrive in a collaborative setting and are passionate about data engineering, this is the perfect opportunity for you.

Qualifications

  • Experience in leading and developing data platforms with hands-on expertise.
  • Strong analytical and problem-solving skills essential for data engineering.

Responsibilities

  • Lead the development of data platforms and optimize data pipelines.
  • Collaborate with stakeholders to resolve data-related technical issues.

Skills

Data Platform Development
ETL / ELT Data Pipelines
Analytical Skills
Problem-Solving Skills
Agile Methodology
Communication Skills
Data Privacy Knowledge

Tools

Splunk
Kafka
Grafana
SQL
CI/CD Pipelines

Job description

Role Title: DPS Data Architect
Location: Hybrid - Sheffield / Birmingham / Edinburgh, 3 days per week
Duration: Until 28/11/2025
Rate: £450 per day - Umbrella Only

Role Description:

  • You will be a technical lead aligned to our Storage & Data Protection Services infrastructure team. Your primary focus will be on building, expanding, and optimising our data platforms.
  • You will develop high-performance data products and data pipelines to further enable our data-driven approach.
  • You will support the improvement of our data self-service capability, building the technology to allow users to analyse the data they need on demand.
  • You will be part of a highly skilled, self-organising team, building forward-thinking solutions and creating new capabilities to support multiple cross-functional teams. We are continuously looking to further improve our technology stack, data quality and reliability, and your vision and ambition will contribute to shaping our solutions toward data-driven decisions across the business.

This role will carry out some or all of the following responsibilities:

  • Work alongside our SME/Storage Architect to deliver a strategic solution that will uplift our monitoring and alerting capability to higher maturity levels, positioning us to deliver end-to-end observability.
  • Support the design of the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using APIs, SQL and big data technologies.
  • Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional/non-functional business requirements. Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability.
  • Work with stakeholders to assist with data-related technical issues and delivery. Work with data and analytics experts to strive for greater functionality in our data systems.
  • Keep up to date with, and maintain expertise in, current tools, technologies and areas such as cyber security and applicable regulations covering data privacy, consent and data residency, ensuring compliance with all relevant controls and standards.

We are looking for a candidate with experience in leading and developing a data platform; you should also have hands-on experience in most of the following key areas:

  • Working with raw data: structured, semi-structured and unstructured.
  • Experience combining large, disconnected datasets using relevant tools and frameworks.
  • Building and optimising ETL / ELT data pipelines.
  • Experience of source control and of Continuous Integration, Delivery and Deployment through CI/CD pipelines.
  • Knowledge and/or experience with Splunk, Kafka & Grafana is beneficial.
  • Supporting and working with BI and Analytics teams in a dynamic environment.
  • Knowledge of Scrum, Kanban or other agile frameworks.
  • Working with Agile methodology, representing the Pod and Area lead in stand-ups and problem-solving meetings.
  • Enabling an SRE culture by solving problems with data engineering.
  • Experience working in relevant market/context, i.e. IT in finance, is desirable.
  • Able to collaborate and effectively pair with other engineers/architects.
  • Strong analytical and problem-solving skills.
  • Self-awareness, with the confidence to work independently and take responsibility for your own development.
  • Excellent written and spoken communication skills – an ability to communicate with impact, ensuring complex information is articulated in a meaningful way to wide and varied audiences.
  • Willingness to undertake the training / study required in this role for new products and services.