
Data Engineer (Real Time) (Remote)

RemoteStar

Cambridge

Hybrid

GBP 45,000 - 75,000

Yesterday

Job summary

An innovative iGaming operator is looking for a Data Engineer to join their dynamic team. In this role, you'll design and develop real-time data processing applications, leveraging cutting-edge technologies to enhance their data platform. You'll work closely with talented professionals in a hybrid work environment, balancing office collaboration with remote flexibility. If you have a passion for data and enjoy problem-solving in a fast-paced setting, this is the perfect opportunity to showcase your skills and contribute to exciting projects that impact users worldwide.

Qualifications

  • Strong knowledge in Scala and distributed computing frameworks.
  • Experience with AWS services and data monitoring tools.

Responsibilities

  • Develop and maintain real-time data processing applications.
  • Collaborate with multi-disciplined teams in an Agile environment.

Skills

Scala

Spark Streaming

Kafka

Agile Methodologies

Analytical Skills

Problem-solving

Data Warehouse Concepts

ETL Concepts

Containerization

Microservice Architecture

Tools

Kafka Streams

Docker

Kubernetes

AWS

Elasticsearch

Prometheus

Grafana

Git

Hadoop

Snowflake

Job description

DATA ENGINEER (Real Time)

About the client:

At RemoteStar, we are currently hiring for a client that is a world-class iGaming operator offering various online gaming products across multiple markets, both through their proprietary gaming sites and partner brands.

Their iGaming platform is central to their strategy, supporting over 25 online brands and growing, and it's used by hundreds of thousands of users worldwide. Our client embraces a Hybrid work-from-home model, with the flexibility of working three days in the office and two days from home.


About the Data Engineer role:

In this role, you will contribute to the design and development of Real-Time Data Processing applications to fulfil business needs.

For any technical data wiz out there, this is the perfect environment to put your skills to the test: building a consolidated Data Platform with innovative features and, most importantly, joining a talented and fun group of people.


What you will be involved in:
  1. Development and maintenance of Real-Time Data Processing applications using frameworks and libraries such as Spark Streaming, Spark Structured Streaming, Kafka Streams, and Kafka Connect.
  2. Manipulation of streaming data: ingestion, transformation, and aggregation.
  3. Keeping up to date with research and development of new technologies and techniques to enhance our applications.
  4. Collaborating closely with Data DevOps, data-oriented streams, and other multi-disciplined teams.
  5. Working comfortably in an Agile environment across the SDLC.
  6. Following the Change and Release Management process.
  7. Troubleshooting with an investigative mindset, thinking outside the box during problem resolution and incident management.
  8. Taking full ownership of assigned projects and tasks while also working well within a team.
  9. Documenting processes well and running knowledge-sharing sessions with the rest of the team.
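As an illustrative sketch only (not taken from the client's codebase), the ingest-transform-aggregate work described above boils down to a windowed group-and-sum over a stream of events. The example below models that shape in plain Scala on an in-memory sequence, using hypothetical names (`BetEvent`, `totalsPerWindow`); in production the same logic would be expressed with Spark Structured Streaming or Kafka Streams over live topics.

```scala
// Hypothetical event type for illustration; field names are assumptions.
final case class BetEvent(userId: String, amountCents: Long, timestampMs: Long)

object StreamAggregation {
  // Assign an event to the start of its one-minute tumbling window.
  private def windowStart(tsMs: Long, windowMs: Long = 60000L): Long =
    tsMs - (tsMs % windowMs)

  // Total stake per (window, user): the same aggregation a Spark
  // Structured Streaming groupBy(window(...), col("userId")).sum
  // would compute, here over a finite in-memory batch.
  def totalsPerWindow(events: Seq[BetEvent]): Map[(Long, String), Long] =
    events
      .groupBy(e => (windowStart(e.timestampMs), e.userId))
      .map { case (key, evs) => key -> evs.map(_.amountCents).sum }
}
```

In a real streaming job the window assignment and state would be handled by the framework (with watermarks for late data); the sketch only shows the per-window aggregation logic itself.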

You're good with:
  1. Strong knowledge of Scala.
  2. Knowledge of, or familiarity with, distributed computing such as Spark, KStreams, and Kafka Connect.
  3. Streaming frameworks such as Kafka.
  4. Knowledge on Monolithic versus Microservice Architecture concepts for building large-scale applications.
  5. Familiar with the Apache suite, including Hadoop modules such as HDFS, YARN, HBase, Hive, and Spark, as well as Apache NiFi.
  6. Familiar with containerization and orchestration technologies such as Docker, Kubernetes.
  7. Familiar with Time-series or Analytics Databases such as Elasticsearch.
  8. Experience with Amazon Web Services using services such as S3, EC2, EMR, Redshift.
  9. Familiar with Data Monitoring and Visualisation tools such as Prometheus and Grafana.
  10. Familiar with software versioning tools like Git.
  11. Comfortable working in an Agile environment involving SDLC.
  12. A decent understanding of Data Warehouse and ETL concepts; familiarity with Snowflake is preferred.
  13. Strong analytical and problem-solving skills.
  14. A good learning mindset.
  15. The ability to prioritise and handle multiple tasks and projects effectively.
