Job Requisition ID #
25WD86308
About Autodesk
Autodesk is a global leader in 3D design, engineering, and entertainment software, empowering innovators everywhere to shape a better world. From architects designing sustainable buildings to filmmakers creating immersive experiences, our technology fuels innovation across industries. Join us in building the future!
Position Overview
Are you passionate about solving technical challenges and building highly scalable, high-performance platforms? Do you thrive in an environment where your work directly impacts millions of users worldwide? If so, we want you on our team!
Autodesk’s Analytics Real-Time Data Platform team is looking for a Senior Software Engineer to help design, develop, and scale our streaming data platform. You’ll be working in a collaborative, agile environment, tackling complex engineering problems, and driving customer-focused innovation.
This is a unique opportunity to build a platform that supports real-time data ingestion for Autodesk products, helping us enhance the customer experience through data-driven insights.
Responsibilities
- Design, develop, and deploy highly available, scalable, distributed systems and microservices.
- Build and optimize real-time data pipelines using Kafka, Flink, AWS Kinesis Firehose, and other modern streaming technologies.
- Develop high-performance code in Java or Python, adhering to software engineering best practices.
- Own software components from design to deployment, ensuring test-driven development (TDD), automation, and high-quality engineering standards.
- Collaborate in an Agile team, participating in code reviews, architectural discussions, and sprint planning.
- Develop and improve observability, monitoring, and self-healing mechanisms for the platform.
- Work with architects and product managers to translate system architecture into well-designed, scalable software components.
- Optimize containerized applications using Docker, CI/CD frameworks (e.g., Spinnaker, Jenkins), and infrastructure as code (e.g., Terraform, AWS CloudFormation).
- Automate anything that can be automated—low tolerance for inefficiency is a must.
Minimum Qualifications
- 5+ years of experience in software engineering, working on large-scale, distributed systems.
- Strong proficiency in Java or Python, with experience in building scalable, high-performance applications.
- Hands-on experience with streaming data technologies such as Kafka, Flink, and AWS Kinesis Firehose.
- Expertise in cloud architectures, particularly AWS, and infrastructure as code (Terraform, CloudFormation).
- Experience with microservices, RESTful APIs, SDK development, and containerized applications.
- Proficiency with CI/CD pipelines using Jenkins, GitHub, Artifactory, and related tools.
- Strong debugging, testing, and performance optimization skills.
- Ability to solve complex problems with creative and scalable solutions.
- Team-oriented mindset with a collaborative and innovative approach.