Empowering enterprises to keep the planet habitable for all, Terrascope aspires to be the easiest carbon measurement and decarbonization platform for companies in the land, nature, and net-zero economy sectors.
Terrascope is a leading decarbonisation software platform designed specifically for the Land, Nature (LAN) and Net-Zero Economy (NZE) sectors. As the easiest-to-use platform for these sectors, our comprehensive solution blends deep industry expertise with advanced climate science, data science, and machine learning. Terrascope enables companies to effectively manage emissions across their supply chains.
Our integrated platform offers solutions for Product and Corporate Carbon Footprinting, addressing Scope 3 and land-based emissions, SBTi FLAG & GHG Protocol LSR reporting, and supporting enterprise decarbonisation goals.
Publicly launched in June 2022, Terrascope works with customers across sectors, from agriculture, food & beverages, manufacturing, retail and luxury, to transportation, real estate, and TMT.
Terrascope is globally headquartered in Singapore and operates in major markets across APAC, North America, and EMEA. Terrascope is a partner of the Monetary Authority of Singapore’s ESG Impact Hub, a CDP Gold Accredited software provider, and a signatory of The Climate Pledge to achieve Net Zero by 2040, and has been independently assured by Ernst & Young.
We are seeking a Senior Data & Analytics Engineer to design and implement scalable data architectures for SaaS platforms in both single-tenant and multi-tenant environments. This role will focus on leveraging the AWS Data Engineering stack, Postgres, and advanced analytics processing techniques, including the creation of materialised views to enable high-volume data analytics. The ideal candidate is skilled in Python scripting and Java or Go, and proficient in handling large-scale data processing workflows. This role will report to the Director of Engineering & Tech and will be crucial in shaping the future of climate-tech SaaS products.
In this role you will be responsible for:
- Building a robust and scalable data platform to support seamless data onboarding and management for a SaaS platform.
- Designing scalable single-tenant and multi-tenant SaaS data platforms, optimizing materialized views for analytics, and developing pipelines using the AWS Data Engineering stack.
- Designing and implementing efficient data migration scripts and workflows to enable smooth data onboarding for new and existing clients.
- Writing clean, efficient code in Python, Java, or Go, and designing robust data models using Postgres ORM frameworks.
- Processing large-scale datasets, optimizing Postgres databases for high performance, and implementing best practices for scaling analytics solutions.
- Optimizing Postgres indexing to improve query performance.
- Creating materialized views and analytics-ready datasets for headless BI.
- Implementing row-level security and designing multi-tenant database architectures for scalability and security.
- Developing pipelines and processes to integrate diverse data connectors into the SaaS platform while ensuring data integrity and consistency.
- Enabling data accessibility and transformation for data science teams by creating analytics-ready datasets and facilitating model integration.
- Ensuring the data platform and migration workflows are optimized for scalability, high performance, and low latency.
- Working closely with product, engineering, and data science teams to align platform capabilities with analytics and machine learning requirements.
- Managing and scaling AWS infrastructure and automating workflows using the GitHub DevOps stack.
To be successful in this role, you should have/be:
- A Bachelor’s degree in a STEM field.
- 5 to 8 years of experience as a Data and Analytics Engineer building data platforms for SaaS applications, including large-scale data processing workflows and advanced analytics and processing techniques.
- Experience in database migration projects.
- Experience building or migrating multi-tenant databases.
- Deep competence in Python scripting for writing ETL pipelines, custom migration scripts, and automating AWS tasks.
- Deep competence in Java/Go for building high-performance, scalable tools to handle complex migration needs.
- Deep competence in data storage and management using AWS RDS (Postgres), S3, and DocumentDB.
- Deep competence in Postgres database architecture and functionality, including indexes, partitioning, and query optimization.
- Deep competence in Materialized Views, including their creation, refresh strategies, and use cases for analytics.
- Advanced SQL skills to design complex queries that aggregate, filter, and transform data effectively for materialized views.
- Deep competence in data processing using AWS Glue, Lambda, and Step Functions.
- Experience with AWS Database Migration Service.
- Deep competence in data processing and analytics using AWS Athena and AWS Redshift.
- Deep competence in security and monitoring using AWS IAM, CloudWatch, and CloudTrail.
- Experience designing mappings between MongoDB’s flexible schema and Postgres’ relational schema.
- Experience in data enrichment and cleaning techniques.
- Proven experience with scalable, large data sets and high-performance SaaS applications.
- Strong ability to work with and optimize large-scale data systems.
- You are a data engineer with a strong background in building scalable analytics solutions in startup environments.
- You are passionate about creating efficient data processing systems and driving analytics innovation.
- You are a problem solver with excellent programming skills and a focus on performance optimization.
- You are a collaborative team player who enjoys mentoring peers and sharing best practices.
- Prior experience in startups and working with remote teams.
- Comfortable with change and ambiguity.
Even better if you are:
- Familiar with the Rust programming language.
- An entrepreneurial problem solver comfortable in managing risk and ambiguity.
We're committed to creating an inclusive environment for our strong and diverse team. We value diversity and foster a community where everyone can be their authentic self.