Company Description
Dynatrace provides software intelligence to simplify cloud complexity and accelerate digital transformation. With automatic and intelligent observability at scale, our all-in-one platform delivers precise answers about the performance and security of applications, the underlying infrastructure, and the experience of all users, enabling organizations to innovate faster, collaborate more efficiently, and deliver more value with dramatically less effort. That’s why many of the world’s largest organizations trust Dynatrace to modernize and automate cloud operations, release better software faster, and deliver unrivaled digital experiences.
Job Description
Our Business Insights team is seeking a Lead Data Scientist to drive impactful data-driven decision-making across our products and operations. In this role, you’ll leverage your advanced expertise to design and develop machine learning solutions, uncover causal relationships in complex systems, and help our customers better understand how their clients interact with their applications. This position requires someone comfortable independently managing challenging projects and guiding others through technical decision-making.
Key Responsibilities and Impact
- Explore and analyse millions of rows of tabular data to uncover meaningful insights and build advanced machine learning models.
- Design and implement causal analysis models to assess the impact of system performance on user experience, providing clear, actionable insights into customer behaviour.
- Develop and deploy machine learning models and workflows, transforming terabytes of traffic data into actionable insights that drive key business decisions.
- Lead the development and deployment of models, ensuring robustness, scalability, and reliability in production environments.
- Build automated solutions for business needs, such as bot detection using advanced machine learning and statistical methods.
- Collaborate closely with product owners, engineers, and other stakeholders to translate analytical findings into impactful features and product improvements.
- Take ownership of technical direction, contribute to architectural decisions, identify technical debt, and advocate for improvements.
- Mentor and support junior data scientists, enhancing team productivity, improving code quality, and fostering a culture of collaboration and learning.
- Continuously monitor and enhance model performance in partnership with the engineering team, improving model impact on user experience and system effectiveness.
Qualifications
Minimum requirements:
- A degree in Engineering, Computer Science, Mathematics, or another quantitative field.
- 10+ years of demonstrable experience in data science or a related data role, including at least 3 years in a lead or senior IC role.
- Expertise in causal analysis methods (e.g., propensity score matching, A/B testing, uplift modeling) with a demonstrated ability to analyse tabular data.
- Strong experience in Python (including Pandas, NumPy, and Scikit-Learn) for data processing and machine learning model construction.
- Proficiency in SQL, with the ability to write complex queries and optimise data retrieval from relational databases.
- Strong communication skills, with the ability to articulate complex technical concepts to non-technical stakeholders.
Desirable requirements:
- Experience with large language models (LLMs) and autonomous agents, with an understanding of their practical applications and limitations.
- Familiarity with big data technologies such as Spark or Snowpark for processing and analysing large datasets efficiently.
- Hands-on experience working with Snowflake, particularly using Snowpark for scalable data engineering and machine learning workflows.
- Experience with AWS services (e.g., S3, Lambda, EC2) for managing machine learning infrastructure and deploying models in a cloud-native environment.
- Hands-on experience with data visualisation tools like Plotly, Seaborn, or other Python-based libraries to convey data insights effectively.
- Familiarity with data pipeline orchestration tools (e.g., Airflow, Luigi) to manage ETL/ELT workflows.
- Ability to operate in a fast-paced, dynamic environment, effectively prioritising multiple projects with competing deadlines.
Additional Information
Expectation: All Insights team members are expected to travel one to two times per year for annual team meetings and events.