About the Team
Come help us build the world's most reliable on-demand logistics engine for delivery! We're bringing on talented engineers to help us create and maintain a 24x7, zero-downtime, global infrastructure system that powers DoorDash's three-sided marketplace of consumers, merchants, and Dashers.
About the Role
Data is at the foundation of DoorDash's success. The Data Engineering team builds database solutions for a variety of use cases, including reporting, product analytics, marketing optimization, and financial reporting. By implementing dashboards, data structures, and data warehouse architecture, this team serves as the foundation for decision-making at DoorDash.
DoorDash is looking for an Analytics Engineer to build and scale data models, pipelines, and self-service analytics across the organization. In this role, you’ll focus on developing a reliable aggregation layer and reporting structure that meets our growing business needs, enabling teams to access and analyze data independently.
You’re excited about this opportunity because you will…
- Design, develop, and maintain robust data models to support analytical and product data needs across the organization
- Collaborate with data engineers, data scientists, and business stakeholders to understand data requirements and translate them into scalable data solutions
- Implement and optimize ETL/ELT processes to ensure data quality, reliability, and performance
- Own and define business KPIs, their measurement plans, data requirements and reporting
- Build processes to ensure correct, timely and reliable reporting
- Address ad-hoc reporting requirements and find pathways for automation
- Build and enforce common design patterns to increase report reusability, readability and standardization
- Build visually appealing, high-performing, and impactful reporting and dashboard products across large datasets using tools like Tableau or Sigma
We’re excited about you because…
- 3+ years of experience working in business intelligence, data analytics, data engineering, or a similar role
- Strong SQL skills and experience with data modeling techniques (e.g., dimensional modeling, 3NF, data vault)
- Proficiency in a programming language such as Python or Scala
- Experience building reporting and dashboarding solutions on a data lake, Snowflake, or a similar ecosystem
- Expertise in database fundamentals, SQL, and performance tuning
- Excellent communication skills and experience working with technical and non-technical teams
- Comfortable working in a fast-paced environment; a self-starter who is self-organizing
- Ability to think strategically, analyze and interpret market and consumer information
Nice to Haves
- Experience with real-time data processing and streaming technologies
- Experience with modern data warehousing platforms (e.g., Snowflake, Databricks, Redshift) and knowledge of data visualization tools (e.g., Looker, Tableau)
- Familiarity with machine learning concepts and their data requirements