Our client is looking for a DevOps engineer for one of its Data Centers of Excellence, which aims to offer expertise, advice, and change levers to support and accelerate global and local data projects.
Integrated into the Data Center of Excellence, the DevOps engineer will work with the Data Architect and will be responsible for establishing best practices and building the templates and software frameworks that other teams will use.
Missions:
- Rationalize environments and practices, and manage technical debt.
- Refactor & optimize data systems (Data CoE’s Analytics/Platform/Cleansing Services and KDPs’ architecture).
Use case illustrations:
- Review and adjust KDPs' software and container logging, diagnostics, and alerts. Enable live-site reviews and capture feedback on system outages (liaise with the UX team).
- Set up templates & guidelines to smooth development collaboration: this means working on Azure DevOps CI & CD pipelines, defining coding best practices (code structure with classes, functions, and objects; coding principles such as separation of concerns), as well as branching patterns, release patterns, deployment strategy, and git hooks (see the pipeline sketch after this list).
- Refactor projects' docker-compose files and Dockerfiles (see the compose sketch after this list).
- Set up blueprints to enable automated code quality assurance (security, semantics, non-regression, load, etc.) with linters, code coverage, Sonar, Snyk, automated tests, and so on; the pipeline sketch below shows where such gates plug in.
- Build projects' cookiecutter templates and a developers' SDK.
- Redesign the KDPs' APIs that serve ML into a central, extensible ML serving API common to all AI projects. Refactor the API to decouple it from the ML back end and leverage (and enhance) the existing toolbox (for I/O, etc.); a minimal sketch follows this list.
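As an illustration of the CI & CD blueprints and automated quality gates mentioned above, here is a minimal Azure Pipelines sketch, assuming a Python project. File layout, Python version and tool choices (ruff, pytest) are assumptions for illustration, not the client's actual setup.

```yaml
# azure-pipelines.yml (illustrative only: layout, versions and gates are assumptions)
trigger:
  branches:
    include:
      - main
      - release/*

pool:
  vmImage: 'ubuntu-latest'

steps:
  - task: UsePythonVersion@0
    inputs:
      versionSpec: '3.10'

  # Dependency management: runtime and dev dependencies pinned separately
  - script: pip install -r requirements.txt -r requirements-dev.txt
    displayName: 'Install dependencies'

  # Linting gate (ruff shown as an example; flake8/pylint fit the same slot)
  - script: ruff check src tests
    displayName: 'Lint'

  # Tests, coverage and non-regression gate
  - script: pytest --cov=src --cov-report=xml --junitxml=test-results.xml
    displayName: 'Tests & coverage'

  - task: PublishTestResults@2
    inputs:
      testResultsFiles: 'test-results.xml'

  - task: PublishCodeCoverageResults@1
    inputs:
      codeCoverageTool: 'Cobertura'
      summaryFileLocation: 'coverage.xml'

  # Sonar and Snyk scans would typically plug in here via their marketplace tasks.
```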
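For the docker-compose refactoring, a minimal sketch of the target shape (explicit networks, named volumes, health checks); service, network and volume names are hypothetical.

```yaml
# docker-compose.yml (sketch only: service, network and volume names are hypothetical)
version: "3.8"

services:
  api:
    build:
      context: .
      dockerfile: Dockerfile
    environment:
      - LOG_LEVEL=INFO
    ports:
      - "8000:8000"
    networks:
      - backend
    volumes:
      - api-data:/var/lib/api
    healthcheck:
      test: ["CMD", "curl", "-f", "http://localhost:8000/health"]
      interval: 30s
      timeout: 5s
      retries: 3

networks:
  backend:
    driver: bridge   # overlay or ipvlan drivers fit the same structure

volumes:
  api-data:
```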
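And for the central ML serving API, a minimal sketch of the decoupling idea, assuming FastAPI (the posting only specifies Python); all names below are hypothetical.

```python
# ml_serving_api.py: sketch of a serving API decoupled from the ML back end.
# FastAPI and every name below are assumptions, not the client's actual stack.
from typing import Any, Protocol

from fastapi import FastAPI
from pydantic import BaseModel


class Predictor(Protocol):
    """Any ML back end (sklearn, ONNX, remote endpoint...) plugs in behind this."""

    def predict(self, features: dict[str, Any]) -> dict[str, Any]: ...


class EchoPredictor:
    """Stand-in back end so the sketch runs without a real model."""

    def predict(self, features: dict[str, Any]) -> dict[str, Any]:
        return {"prediction": None, "echo": features}


class PredictRequest(BaseModel):
    features: dict[str, Any]


def create_app(predictor: Predictor) -> FastAPI:
    """The API only knows the Predictor interface, never a concrete model."""
    app = FastAPI(title="ML serving API")

    @app.post("/predict")
    def predict(request: PredictRequest) -> dict[str, Any]:
        return predictor.predict(request.features)

    return app


app = create_app(EchoPredictor())
# Run locally with: uvicorn ml_serving_api:app --reload
```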
Expected profile:
- Bachelor’s and/or Master’s degree from Engineering School / University.
- Deep knowledge of DevOps & software development.
- Proactive in building solutions, autonomous & responsive.
- Results-oriented, tech-savvy.
- Fluent in English; French is nice to have.
Technical skills:
- Cloud: Good knowledge of Microsoft Azure.
- Development: Strong development skills, especially for building frameworks and APIs.
- Infrastructure: Good knowledge of containerization; Kubernetes is a nice-to-have.
- DevOps: Excellent knowledge (including previous experience) of DevOps, especially with:
- SDK & Development environments.
- Development dependencies management.
- Site Reliability Engineering & Monitoring.
- Smooth development collaboration (incl. CI/CD blueprints).
- Automated code base compliance.
- Knowledge of data architecture and security is a nice-to-have.
Tools & technologies:
- Integration: API gateway, ADF, Azure Service Bus, Event Grid.
- Serverless: Azure Functions & Durable Functions.
- Languages: Python; nice to have: C#, JavaScript.
- Containers: Docker Compose, networking (bridge, overlay, IPvlan), volumes/mounts, metrics collection & management.
- DevOps tooling: Azure DevOps (or similar, e.g. Bitbucket).
- Infra as Code: Terraform, ARM.
- Monitoring: Application Insights, Log Analytics; nice to have: New Relic.
Opportunity based in Paris for a long-term mission.
If you are interested, feel free to apply, making sure to include an up-to-date resume, your availability for interviews and a start date, as well as your daily rate (open to portage companies).