Intuitive Surgical
Posted 3 days ago
Senior
Sunnyvale, CA
Miscellaneous Manufacturing

About the position

We're looking for a Senior Software Engineer who is passionate about building a modern, scalable Data as a Service (DaaS) platform that powers Intuitive's Digital products and supports over 2,000 engineers across the organization. In this role, you will own and evolve critical components of our real-time and micro-batch data pipelines that support product development, internal tools, and analytics. Your work will focus on enabling high-throughput, low-latency data delivery through streaming pipelines, dynamic transformations, and APIs, driving discoverability, accessibility, and actionable insights. You will help define the architecture and engineering practices that support self-service analytics and operational decision-making at scale. As a catalyst for change, you will be at the forefront of reimagining how engineering teams consume and interact with data. Long-term success in this role means building robust, efficient systems and replacing legacy processes with modern solutions that allow teams to move faster, with greater confidence and autonomy.

Responsibilities

  • Design and build scalable, distributed Data-as-a-Service systems that ingest, process, and serve data from robotics, manufacturing, engineering, and clinical sources in both real-time and batch modes (a minimal streaming sketch follows this list)
  • Develop and maintain robust APIs, data services, and tooling to provide internal teams with secure, efficient, and intuitive access to high-quality data
  • Partner with engineering, analytics, and business stakeholders to evolve data contracts and models that support emerging use cases and ensure semantic consistency
  • Implement CI/CD practices for data services, including automated testing for data quality, service reliability, and schema evolution
  • Champion a self-service data culture by building discoverable, well-documented data products and guiding teams toward empowered, autonomous data access
  • Act as a technical leader within the data domain, driving best practices, mentoring teammates, and continuously improving how data is produced, shared, and consumed across the organization
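
To give a flavor of the streaming work described above, here is a minimal, hypothetical sketch of a real-time ingest loop with a record-level data-quality check. It is not Intuitive's actual pipeline: the topic names, the assumed record schema, the quality rule, and the choice of the confluent-kafka client are all illustrative assumptions.

```python
# Hypothetical sketch: consume telemetry from Kafka, validate each record,
# and route bad records to a dead-letter topic. Topic names, schema, and
# quality rules are illustrative assumptions only.
import json
from confluent_kafka import Consumer, Producer

REQUIRED_FIELDS = {"device_id", "timestamp", "metric", "value"}  # assumed schema

def is_valid(record: dict) -> bool:
    """Minimal data-quality check: required fields present and value numeric."""
    return REQUIRED_FIELDS <= record.keys() and isinstance(record["value"], (int, float))

def run() -> None:
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",   # placeholder broker
        "group.id": "daas-quality-demo",
        "auto.offset.reset": "earliest",
    })
    producer = Producer({"bootstrap.servers": "localhost:9092"})
    consumer.subscribe(["device-telemetry"])     # hypothetical source topic
    try:
        while True:
            msg = consumer.poll(1.0)
            if msg is None or msg.error():
                continue
            record = json.loads(msg.value())
            # Valid records go to a curated topic; the rest to a dead-letter topic.
            target = "device-telemetry-curated" if is_valid(record) else "device-telemetry-dlq"
            producer.produce(target, json.dumps(record).encode("utf-8"))
    finally:
        producer.flush()
        consumer.close()

if __name__ == "__main__":
    run()
```

In a production pipeline the validation, offset commits, and error handling would of course be far more involved; this sketch only illustrates the ingest-validate-route pattern.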

Requirements

  • Solid quantitative background in Computer Science, Engineering, Physics, or Math, or 8-10+ years of hands-on experience in a technically demanding role
  • Proficient in at least two major programming languages such as Python, Go, Scala, C++, or Java, with a strong understanding of software design and architecture
  • Proven experience building data pipelines and working with distributed systems using technologies like Apache Spark, Kafka, Elasticsearch, Snowflake, and Airflow
  • Strong collaborator who actively contributes to code reviews, system design discussions, sprint planning, and KPI evaluations to drive team excellence and technical quality

Nice-to-haves

  • Experience working on Data Platform or Infrastructure Engineering teams
  • Hands-on experience with AWS, Docker, Kubernetes, Kafka, Elasticsearch, Apache Airflow, Snowflake, and Terraform (a minimal Airflow sketch follows this list)
  • Familiarity with CI/CD best practices for DataOps and deployment automation
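
For readers less familiar with the orchestration tooling named above, the sketch below shows how a daily micro-batch job might be scheduled in Apache Airflow 2.x. The DAG id, task names, and extract/load callables are hypothetical placeholders, not part of this role's actual codebase.

```python
# Hypothetical sketch: a daily micro-batch DAG in Apache Airflow 2.x.
# DAG id, task names, and the extract/load callables are illustrative only.
from datetime import datetime
from airflow import DAG
from airflow.operators.python import PythonOperator

def extract(**context):
    # Placeholder: pull a day's worth of raw records from an upstream source.
    print("extracting partition", context["ds"])

def load(**context):
    # Placeholder: write transformed records to the warehouse (e.g., Snowflake).
    print("loading partition", context["ds"])

with DAG(
    dag_id="daas_micro_batch_demo",
    start_date=datetime(2024, 1, 1),
    schedule="@daily",          # Airflow 2.4+ keyword; older versions use schedule_interval
    catchup=False,
) as dag:
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    load_task = PythonOperator(task_id="load", python_callable=load)
    extract_task >> load_task   # run load after extract completes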