Who We Are
Since its inception, Aventum Group has sought a different approach to insurance. We are on a mission to be the ‘most inspiring speciality (re)insurance group in the world’.
At the heart of Aventum are our people. Working together in dynamic, service-focused teams, we prioritise our customers in everything we do. Collaboration fuels our success, courage drives our innovation and continuous improvement keeps us ahead in a rapidly evolving industry. Our shared commitment is to revolutionise insurance for the better, one day at a time.
We also believe that investing in our people is investing in our future. By empowering people across the Group to develop their careers, advance within the Group, and embrace new challenges, we build an environment where growth and learning never stop.
Our competitive benefits package, offered via a flexible benefits platform, reflects this. Beyond core benefits, employees have the freedom to tailor their benefits to meet their individual needs, supporting their unique goals and ambitions.
Role Summary
You will work in various settings to build systems that collect, manage, and convert raw data into usable information for data scientists and business analysts to interpret. Your ultimate goal is to make data accessible so that Aventum stakeholders can use it to evaluate and optimise their performance.
Role Accountabilities
Strategy
- Support the design, development, implementation, management, and support of enterprise-level ETL/ELT processes and environments.
- Combine data from multiple sources using technical and business processes to provide a unified, single view of the data. Act as an architect, making strategic decisions.
- Share responsibility for accessing, validating, and querying data from various repositories using available tools. Build and maintain data integration processes using SQL Services and other ETL/ELT processes and scripting tools, and support ongoing requests and projects relating to the data warehouse, MI, and fast-moving financial data.
- Designing the infrastructure/architecture of the big data platform.
- Evaluating, comparing and improving the different approaches including design patterns innovation, data lifecycle design, data ontology alignment, annotated datasets, and elastic search approaches.
- Developing and maintaining reliable data pipelines and schemas that feed other data processes. This includes both the technical processes and the business logic that transform data from disparate sources into cohesive, meaningful, and valuable data, with quality, governance, and compliance considerations.
- Customising and managing integration tools, databases, warehouses, and analytical systems.
- Identifying and eliminating all non-value-adding activities through automation or outsourcing.
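As a minimal sketch of the pipeline work described above, in Python. The record shape, field names, and transformation rules here are hypothetical, chosen only to show the extract-transform-load pattern with basic data-quality handling:

```python
from dataclasses import dataclass

# Hypothetical raw record from one of several source systems.
@dataclass
class RawPolicy:
    policy_id: str
    premium: str        # arrives as text from the source feed
    currency: str

def extract() -> list[RawPolicy]:
    # In practice this would read from an API, file drop, or database.
    return [RawPolicy("P-001", "1250.50", "GBP"),
            RawPolicy("P-002", "980.00", "usd")]

def transform(rows: list[RawPolicy]) -> list[dict]:
    # Apply business logic and basic data-quality rules:
    # cast the premium to a number, normalise currency codes.
    return [{"policy_id": r.policy_id,
             "premium": float(r.premium),
             "currency": r.currency.upper()} for r in rows]

def load(rows: list[dict]) -> int:
    # Stand-in for a write to a warehouse table;
    # returns the number of rows loaded.
    return len(rows)

loaded = load(transform(extract()))
```

In a production pipeline the same three stages would typically be orchestrated by a tool such as Azure Data Factory, with the quality rules expressed as reusable, testable steps.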
Operations
- Design and implement the management, monitoring, security, and privacy of data using the full stack of Azure data services to satisfy business needs.
- Ensuring non-functional system characteristics such as security, maintainability, quality, performance, and reliability are captured, prioritized, and incorporated into products.
- Leverage Agile, CI/CD and DevOps methodologies to deliver high-quality products on time.
- Architecting, building, testing, and maintaining the data platform as a whole. Develop and support a wide range of data transformations and migrations for the whole business.
- Construct custom ETL processes: Design and implement data pipelines, data marts and schemas, access versatile data sources and apply data quality measures.
- Monitoring the complete process and applying necessary infrastructure changes to speed up query execution and analyse data more efficiently; this includes database optimisation techniques (data partitioning, database indexing, and denormalisation) and efficient data ingestion (data-mining techniques and different data ingestion APIs).
- Responding to errors and alerts to correct and re-process data. Investigating data mismatches and applying solutions. Performing data scrubbing and analysis to troubleshoot data integration issues and determine root causes.
- Any additional duties as assigned.
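A rough illustration of the error-handling responsibility above: rather than failing an entire run, rejected records can be routed to a dead-letter list for investigation and re-processing. The function and record names here are hypothetical:

```python
import logging

logging.basicConfig(level=logging.INFO)
log = logging.getLogger("ingest")

def process_batch(records, parse):
    """Parse each record; route failures to a dead-letter list
    for investigation and re-processing rather than failing the run."""
    ok, dead_letter = [], []
    for rec in records:
        try:
            ok.append(parse(rec))
        except (ValueError, KeyError) as exc:
            # Log enough context to investigate the mismatch later.
            log.warning("record rejected: %r (%s)", rec, exc)
            dead_letter.append(rec)
    return ok, dead_letter

good, bad = process_batch(["100", "abc", "250"], int)
```

Once the root cause is found and fixed, the dead-letter records can be replayed through the same function, keeping correction and re-processing as a normal part of the pipeline rather than a manual rescue.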
Role Requirements
- Bachelor’s degree or equivalent in an engineering/numerate subject (e.g. Engineering, Statistics, Mathematics, Computer Science)
- Experience in full-stack development, applied to building data science products (e.g. some or all of Python/R, Linux scripting, SQL, and Docker, coupled with front ends such as JavaScript)
- Some experience as a developer building data pipelines and schemas, implementing data warehouses, and developing SQL databases.
- Hands-on experience using Synapse or related tools with cloud-based resources (e.g. Stored Procedures, ADF, NoSQL Databases, JSON/XML data formats).
- Hands-on experience with Azure Functions, Azure service bus, Azure Data Factory data integration techniques.
- Knowledge of Data Modelling concepts, monitoring, designs and techniques.
- Knowledge of Data Warehouse project lifecycle, tools, technologies, and best practices.
- Experience using cloud computing platforms (ADLS Gen2), the Microsoft stack (Synapse, Databricks, Fabric, Profisee), Snowflake data integration, Azure Service Bus, Delta Lake, Azure DevOps, Azure Monitor, Azure Data Factory, SQL Server, Azure Data Lake Storage, and Azure App Service.
- Experience with Azure SQL Database, Cosmos DB, NoSQL, MongoDB.
- Experience with Agile, and DevOps methodologies.
- Awareness and knowledge of ELT/ETL, DWH, APIs (RESTful), Spark APIs, FTP protocols, SSL, SFTP, PKI (Public Key Infrastructure) and Integration testing.
Skills and Abilities
- Knowledge of Python, SQL, SSIS, and Spark.
- Demonstrated ability to develop complex SQL queries and Stored Procedures.
- Relationship-building and stakeholder management.
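As a small, self-contained illustration of the SQL query work described above, using Python's built-in sqlite3 module. The claims table and its schema are invented for the example:

```python
import sqlite3

# Hypothetical claims table; the schema is illustrative only.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE claims (line_of_business TEXT, amount REAL);
    INSERT INTO claims VALUES
        ('Marine', 1000.0), ('Marine', 500.0), ('Property', 2000.0);
""")

# Aggregate claims by line of business, largest total first.
rows = conn.execute("""
    SELECT line_of_business, SUM(amount) AS total
    FROM claims
    GROUP BY line_of_business
    ORDER BY total DESC
""").fetchall()
```

In the role itself, queries of this shape would typically live in stored procedures against SQL Server or Synapse rather than an in-memory database.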
Management Duties
We are an equal-opportunity employer, and we are proud to share that 93% of our employees say they can be themselves at work. We aim to hire our industry's finest people because the best people drive the best outcomes. And we forever challenge the status quo because we know there are always ways to improve things. Because together, we're limitless.
We value applicants from all backgrounds and foster a culture of inclusivity. We understand the need for flexibility, so we work in a hybrid model. Please let us know if you require any reasonable adjustments during the recruitment process.
FCA Conduct Rules
Under the Senior Managers and Certification Regime, the FCA and Aventum expect that:
- You must act with integrity.
- You must act with due skill, care and diligence.
- You must be open and cooperative with the FCA, the PRA and other regulators.
- You must pay due regard to the interests of customers and treat them fairly.
- You must observe proper standards of market conduct.
- You must act to deliver good outcomes for retail customers.