
Data Engineer

NorthWest EHealth Limited

Greater Manchester

On-site

GBP 40,000 - 70,000

Yesterday

Job summary

An established industry player is seeking a skilled data engineer to enhance and maintain data pipelines and infrastructure. This role involves developing ETL processes, ensuring data quality, and collaborating with cross-functional teams to deliver effective solutions. The ideal candidate will have a strong background in Microsoft products and cloud technologies, particularly Azure. Join a dynamic team committed to continuous improvement and innovation in a highly regulated environment. If you are passionate about data and eager to make a significant impact, this opportunity is perfect for you.

Qualifications

  • At least 3 years' experience developing data pipelines and ETLs.
  • Good knowledge of cloud technologies, particularly Azure Data Factory.

Responsibilities

  • Developing and optimising data pipelines and ETL processes.
  • Collaborating with teams to define solutions and requirements.

Skills

Developing data pipelines

ETL processes

Data infrastructure

Database security

Cloud technologies (Azure)

Teamwork and collaboration

Agile methods

Education

A good degree in a relevant subject

Tools

SQL Server

SSIS

Azure Data Factory

Atlassian Tools (JIRA, Confluence, BitBucket)

Job description

Job Purpose:

To develop, enhance, and maintain the data pipelines and data infrastructure across NWEH's software products. To provide expertise and guidance for all data-related work across the business.

Key accountabilities:

  1. The development, performance, management, and troubleshooting of the ETL processes, data pipelines and data infrastructure supporting NWEH's software and service products
  2. The effective and reliable operation of NWEH's data pipelines and ETL processes
  3. Appropriate adoption of new tools, technologies, and practices to ensure the team stays up to date and follows industry best practice
  4. Supporting data work across the team and wider business

Key responsibilities:
  1. Designing, developing, maintaining, and optimising data pipelines, ETL processes, and databases
  2. Driving continuous improvement by refining processes and products and identifying new tools, standards, and practices
  3. Working with teams across the business to define solutions, requirements, and testing approaches
  4. Assisting with process definition, always ensuring compliance with organisation processes and regulatory standards
  5. Ensuring compliance with regulatory requirements and standards and audit readiness
  6. Automation and ongoing monitoring of data and data processes, ensuring data quality and integrity
  7. Sharing knowledge and providing guidance on databases and data
  8. Maintaining up-to-date, accurate, and concise documentation of database configuration and processes
  9. Working across the team to deliver best-practice infrastructure and infrastructure deployment and management processes

Essential skills/experience:
  1. A good degree in a relevant subject or equivalent professional experience in a data role
  2. At least 3 years' professional experience developing data pipelines and ETLs using Microsoft products; at least 1 year working with cloud-native technologies such as Azure Data Factory
  3. Demonstrable experience of delivering technical work within time and budget constraints
  4. Good understanding of data security best practice
  5. Experience of supporting ETLs or data pipelines crucial to a production system
  6. Experience of working in a cross-functional team to deliver technical solutions

Desirable skills:
  1. Experience with SQL Server, SSIS, Azure Data Factory and Azure SQL
  2. Experience with cloud and Infrastructure as Code, particularly in an Azure setting using Bicep
  3. Understanding of DevOps practices and the associated benefits
  4. Skill in database testing: unit, performance, stress, and security
  5. Experience of working in an agile team
  6. Experience of working in a highly regulated industry and with highly sensitive data
  7. Exposure to large data solutions like Snowflake, Trino, Synapse, Azure Data Lake, and Databricks
  8. Experience of data science using R, Stata or Python
  9. Familiarity with Atlassian tools: JIRA, Confluence, BitBucket
  10. Understanding of clinical trials, GCP and GxP

Personal attributes:
  1. Collaboration - teamwork, listening and communication
  2. Professionalism and commitment to delivery and improvement
  3. Curiosity and continuous learning
  4. Attention to detail while remaining solution-oriented
  5. Adaptability
