Oct 01, 2024 - TechBiz Global GmbH is hiring a remote Data Engineer. Location: Germany.
TechBiz Global is a leading recruitment and software development company. Our diverse, globally distributed team provides IT recruitment, outstaffing, outsourcing, software development, and a range of consulting services, with a primary focus on helping our partners achieve their business goals.
Headquartered in Germany, we serve successful clients all over the world and understand their unique needs. Our team has hands-on experience with the challenges that come with rapid growth in the IT sector. That’s why all of our offerings are built with a tech mindset.
About the role:
The Data Engineer will collaborate closely with other Data Engineers, Software Development Engineers (SDEs), Analytics Engineers (AEs), Data Scientists (DSs), and Infrastructure Engineers (IEs) in the technology team. The role reports directly to the company’s Director of Data Engineering.
Responsibilities include, but are not limited to:
- Collaborate closely with the Data Engineers, SDEs, AEs, DSs, and IEs in the technology team.
- Build a modular, flexible data engineering infrastructure that can be improved iteratively. Your internal customers will include eCommerce, marketing, supply chain, finance, business development, and legal.
- Treat our data as a strategic resource and build data products to support every aspect of our business.
- Build our data model(s) in addition to our data infrastructure in collaboration with the Director of Analytics and Head of Infrastructure Engineering.
Requirements
Technical Expertise
Data Architecture and Modeling:
- Experience working with data architectures for scalable and performant systems.
- Experience with dimensional data modeling, OLAP systems, and data warehousing concepts (star/snowflake schema, etc.).
- Strong experience with GCP.
- Experience with BigQuery and Looker Studio.
- Experience with NoSQL databases (e.g., DynamoDB, Cassandra) and SQL databases (e.g., PostgreSQL, MySQL).
- Experience designing, implementing, and optimizing ETL pipelines for efficient data movement and transformation.
- Hands-on experience with workflow management tools such as Apache Airflow.
- Proficiency in Python and other relevant programming or scripting languages for data processing.
- Strong SQL expertise for querying large datasets.
- Knowledge of API integration for extracting data from multiple sources (internal and external systems such as Amazon's API for FBA, sales data, etc.).
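To illustrate the dimensional-modeling and SQL skills listed above, here is a minimal sketch of a star schema and an OLAP-style rollup query, using Python's built-in sqlite3 as a stand-in for a warehouse such as BigQuery. All table and column names here are hypothetical, not taken from the posting.

```python
import sqlite3

# In-memory database standing in for a data warehouse; the schema is illustrative.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# A tiny star schema: one fact table referencing two dimension tables.
cur.executescript("""
CREATE TABLE dim_product (product_id INTEGER PRIMARY KEY, category TEXT);
CREATE TABLE dim_date (date_id INTEGER PRIMARY KEY, month TEXT);
CREATE TABLE fact_sales (
    product_id INTEGER REFERENCES dim_product(product_id),
    date_id    INTEGER REFERENCES dim_date(date_id),
    revenue    REAL
);
INSERT INTO dim_product VALUES (1, 'electronics'), (2, 'apparel');
INSERT INTO dim_date VALUES (10, '2024-09'), (11, '2024-10');
INSERT INTO fact_sales VALUES (1, 10, 100.0), (1, 11, 50.0), (2, 10, 25.0);
""")

# Typical rollup: join the fact table to its dimensions and aggregate.
cur.execute("""
SELECT p.category, d.month, SUM(f.revenue) AS revenue
FROM fact_sales f
JOIN dim_product p ON p.product_id = f.product_id
JOIN dim_date    d ON d.date_id    = f.date_id
GROUP BY p.category, d.month
ORDER BY p.category, d.month
""")
rows = cur.fetchall()
print(rows)
```

The same join-and-aggregate shape carries over to warehouse SQL dialects; only the DDL and connection layer change.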
DevOps and Automation:
- Experience with CI/CD pipelines, automation scripts, and infrastructure-as-code tools (e.g., Terraform, CloudFormation).
- Ability to implement data versioning and data quality checks.
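As a hedged sketch of the data quality checks mentioned above, a pipeline might validate a batch of records before loading. The record layout and the specific rules (unique, non-null IDs; non-negative amounts) are hypothetical examples, not requirements from the posting.

```python
# Minimal batch-level data-quality checks of the kind a pipeline runs before loading.
def check_quality(rows):
    """Return a list of human-readable violations found in `rows`."""
    violations = []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Rule 1: order_id must be present and unique within the batch.
        if row.get("order_id") is None:
            violations.append(f"row {i}: missing order_id")
        elif row["order_id"] in seen_ids:
            violations.append(f"row {i}: duplicate order_id {row['order_id']}")
        else:
            seen_ids.add(row["order_id"])
        # Rule 2: monetary amounts must not be negative.
        if row.get("amount") is not None and row["amount"] < 0:
            violations.append(f"row {i}: negative amount {row['amount']}")
    return violations

batch = [
    {"order_id": 1, "amount": 10.0},
    {"order_id": 1, "amount": -5.0},   # duplicate id and negative amount
    {"order_id": None, "amount": 3.0}, # missing id
]
print(check_quality(batch))
```

In practice these rules would typically live in a framework (e.g., as pipeline tasks) and fail the load or raise an alert rather than just print.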
Data Privacy:
- Strong understanding of data privacy and compliance regulations like GDPR or CCPA.
- Ensure data governance best practices, including encryption, access controls, and audit trails.
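One common governance technique behind the GDPR/CCPA and access-control points above is pseudonymizing identifiers before they reach analytics tables, so analysts can join on a stable token without seeing raw PII. The sketch below uses a keyed HMAC; the secret key is a placeholder that would come from a managed secret store in practice.

```python
import hashlib
import hmac

# Placeholder only; in a real system this comes from a secrets manager, not source code.
SECRET_KEY = b"replace-with-a-managed-secret"

def pseudonymize(email: str) -> str:
    """Keyed hash of an identifier: stable for joins, not reversible without the key."""
    return hmac.new(SECRET_KEY, email.lower().encode(), hashlib.sha256).hexdigest()

token = pseudonymize("Jane.Doe@example.com")
# Normalizing case first means the same person always maps to the same token.
assert token == pseudonymize("jane.doe@example.com")
print(token[:16])
```

A keyed hash (rather than a plain SHA-256) matters here: without the key, an attacker cannot rebuild the mapping by hashing a list of known email addresses.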
Monitoring and Observability:
- Expertise in implementing data quality checks, monitoring data pipelines, and ensuring that systems are reliable and efficient.
- Proficiency with monitoring tools for real-time visibility of data flows.
Nice to Have:
- Experience with ecommerce/retail/supply chain data modeling is a big plus.
- Experience with the Amazon seller ecosystem is a big plus.
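The pipeline-monitoring expectations under Monitoring and Observability can be sketched as a simple freshness-and-volume health check. The thresholds and metric names below are illustrative assumptions, not values from the posting.

```python
from datetime import datetime, timedelta, timezone

# Toy health check a scheduler might run after each load; thresholds are illustrative.
def pipeline_alerts(last_loaded_at, row_count,
                    max_lag=timedelta(hours=2), min_rows=1):
    """Return alert messages for stale or suspiciously small loads."""
    alerts = []
    lag = datetime.now(timezone.utc) - last_loaded_at
    if lag > max_lag:
        alerts.append(f"stale data: last load {lag} ago exceeds {max_lag}")
    if row_count < min_rows:
        alerts.append(f"low volume: {row_count} rows loaded (expected >= {min_rows})")
    return alerts

# A load that finished three hours ago with zero rows trips both alerts.
stale = datetime.now(timezone.utc) - timedelta(hours=3)
print(pipeline_alerts(stale, row_count=0))
```

In a real deployment these checks would publish to a monitoring system (metrics plus alerting) rather than return strings, but the freshness/volume logic is the same.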
Professional and Communication Skills
- Ability to solve complex data challenges, propose innovative solutions for scaling and optimizing data architecture, and proactively drive improvements.
- Ability to adapt to a fast-paced, constantly evolving scale-up environment.
- Capable of handling well-defined projects and prioritizing based on business impact.
- Proficient in English.