Data Architect

Experis

United Kingdom

Hybrid

GBP 125,000 - 150,000

Full time

2 days ago

Job summary

An established industry player seeks a DPS Data Architect to lead the development of innovative data platforms. In this hybrid role, you will focus on building and optimizing data pipelines, ensuring high performance and scalability. Collaborate with cross-functional teams to enhance data self-service capabilities and drive data-driven decisions across the organization. Ideal candidates will thrive in a fast-paced environment, leveraging the latest tools and technologies to tackle evolving challenges. If you are passionate about data architecture and want to make a significant impact, this opportunity is perfect for you.

Qualifications

  • Experience leading/developing a data platform, with hands-on expertise in key areas.
  • Strong analytical and problem-solving skills with excellent communication.

Responsibilities

  • Deliver strategic solutions to enhance monitoring and alerting capabilities.
  • Create and maintain optimal data pipeline architecture for complex datasets.
  • Support stakeholders with data-related technical issues and delivery.

Skills

Data Platform Development
ETL / ELT Data Pipelines
Data Analysis
Agile Methodology
Problem-Solving Skills
Communication Skills
Analytical Skills

Tools

Splunk
Kafka
Grafana
CI/CD Pipelines

Job description

Role Title: DPS Data Architect
Location: Hybrid - Sheffield / Birmingham / Edinburgh, 3 days per week
Duration: Until 28/11/2025
Rate: £450 per day - Umbrella Only

Role Description:
You will be a technical lead aligned to our Storage & Data Protection Services infrastructure team. Your primary focus will be on building, expanding, and optimising our data platforms. You will develop high-performance data products and data pipelines to further enable our data-driven approach, and you will support the improvement of our data self-service capability, building the technology that allows users to analyse the data they need on demand.

You will be part of a highly skilled, self-organising team, building forward-thinking solutions and creating new capabilities to support multiple cross-functional teams. We are continuously looking to improve our technology stack, data quality, and reliability, and your vision and ambition will contribute to shaping our solutions toward data-driven decisions across the business.

The ideal candidate is self-directed, comfortable with challenging and leading on best practice, and able to adapt to regularly shifting business requirements and occasional ambiguity. This is a fast-paced, hands-on role well suited to someone who loves clean design, clean architecture, and using the latest tools and technology to tackle constantly evolving business and tech challenges.

Responsibilities:

  1. Work alongside our SME/Storage Architect to deliver a strategic solution that will uplift our monitoring and alerting capability to higher maturity levels, positioning us to deliver end-to-end observability.
  2. Support the design of the infrastructure required for optimal extraction, transformation, and loading of data from a wide variety of data sources using APIs, SQL and big data technologies.
  3. Create and maintain optimal data pipeline architecture. Assemble large, complex data sets that meet functional/non-functional business requirements.
  4. Identify, design, and implement internal process improvements: automating manual processes, optimising data delivery, re-designing infrastructure for greater scalability.
  5. Work with stakeholders to assist with data-related technical issues and delivery. Work with data and analytics experts to strive for greater functionality in our data systems.
  6. Keep up to date with, and maintain expertise in, current tools and technologies, as well as areas such as cyber security and applicable regulations concerning data privacy, consent, data residency, etc., ensuring compliance with all relevant controls and standards.

Minimum Requirements:
We are looking for a candidate with experience in leading/developing a data platform. You should also have hands-on experience in most of the following key areas:

  1. Working with raw, structured, semi-structured, and unstructured data.
  2. Experience combining large, disconnected datasets, using relevant tools and frameworks.
  3. Building and optimising ETL / ELT data pipelines.
  4. Experience of source control and of Continuous Integration, Delivery, and Deployment through CI/CD pipelines.
  5. Knowledge and/or experience with Splunk, Kafka & Grafana is beneficial.
  6. Supporting and working with BI and Analytics teams in a dynamic environment.
  7. Knowledge of Scrum, Kanban or other agile frameworks.
  8. Work with Agile methodology, representing the Pod and Area lead in standups and problem-solving meetings.
  9. Enable an SRE culture by solving problems with data engineering.
  10. Experience working in relevant market/context, i.e. IT in finance, is desirable.
  11. Able to collaborate and effectively pair with other engineers/architects.
  12. Strong analytical and problem-solving skills.
  13. Self-awareness with confidence to work independently and take responsibility for own development.
  14. Excellent written and spoken communication skills - an ability to communicate with impact, ensuring complex information is articulated in a meaningful way to wide and varied audiences.
  15. Willingness to undertake the training / study required in this role for new products and services.