DoorDash USA • posted 1 day ago
$231,200 - $340,000/Yr
Full-time • Senior
San Francisco, CA

About the position

DoorDash is a data-driven organization that relies on timely, accurate, and reliable data to drive many business and product decisions. The Data Platform team owns all the infrastructure necessary to run an operationally efficient analytical data stack. The Core Data portion of this includes data ingestion (batch and real-time), data compute and transformation, data storage (warehouse, data lake, OLAP, etc.), and querying infrastructure, as well as data compliance, quality, and governance. Adjacent areas of major focus are Machine Learning Infrastructure and workflows, the Experimentation Platform, Knowledge Graphs, and various Data Science and Analytics tooling.

Responsibilities

  • Build data-intensive solutions used by DoorDash engineers, data scientists, analysts, and business users across the company.
  • Drive and deliver the product vision for data products.
  • Guide engineers and engineering leadership toward the right technical choices.
  • Mentor several senior engineers and hold a high bar for technical competence.
  • Set the right architectural patterns and drive build-vs.-buy decisions.
  • Work with various vendors in the data solutions space.
  • Make judicious investments in the right areas anticipating future company needs.
  • Plan for long-term strategy and engineering excellence.
  • Break down large systems into manageable, sustainable components.
  • Strive for continuous improvement of data architecture and development process.
  • Collaborate with stakeholders, external partners, and peer data leaders.

Requirements

  • Extensive experience building and operating scalable, fault-tolerant distributed systems for large-scale, data-intensive applications.
  • Experience building data products and abstractions on top of infrastructure.
  • Familiarity with a range of large-scale data systems, such as data processing, complex/high-volume real-time insights, and data quality and reliability frameworks.

Nice-to-haves

  • Experience with backend service and tool development in Golang, Kotlin, and/or Python.
  • Familiarity with AWS.
  • Experience with Kubernetes, Docker, Kops, Helm.
  • Knowledge of Apache Kafka, Apache Flink, Apache Spark, Trino (Presto), Apache Airflow/Dagster, Apache Superset, AWS S3, Snowflake, Amplitude, and customer data platforms (e.g., Segment.com).

Benefits

  • 401(k) plan with an employer match.
  • Paid time off.
  • Paid parental leave.
  • Wellness benefits.
  • Paid holidays.
  • Medical, dental, and vision benefits.
  • Disability and basic life insurance.
  • Family-forming assistance.
  • Commuter benefit match.
  • Mental health program.