The Trade Desk • posted 1 day ago
$136,300 – $249,900/yr
Full-time • Senior
San Jose, CA

About the position

The Trade Desk is a global technology company with a mission to create a better, more open internet for everyone through principled, intelligent advertising. Handling over 1 trillion queries per day, our platform operates at an unprecedented scale. We have also built something even stronger and more valuable: an award-winning culture based on trust, ownership, empathy, and collaboration. We value the unique experiences and perspectives that each person brings to The Trade Desk, and we are committed to fostering inclusive spaces where everyone can bring their authentic selves to work every day.

Do you have a passion for solving hard problems at scale? Are you eager to join a dynamic, globally connected team where your contributions will make a meaningful difference in building a better media ecosystem? Come and see why Fortune magazine consistently ranks The Trade Desk among the best small- to medium-sized workplaces globally.

Responsibilities

  • Design, build, and maintain data pipelines for processing content catalog data using technologies like Apache Spark for the Ventura OS project
  • Develop efficient data processing workflows that can handle large-scale data with optimal performance
  • Create robust APIs that securely expose data services to other parts of our platform
  • Collaborate with data scientists to integrate with their models and algorithms in production environments
  • Build scalable data integration solutions that connect various data sources and destinations
  • Implement monitoring and observability for data pipelines to ensure reliability and performance
  • Optimize data storage and retrieval mechanisms for our specific use cases
  • Contribute to architectural decisions around data infrastructure
  • Work closely with cross-functional teams to understand business requirements and translate them into technical solutions
  • Apply best practices for data engineering, including data quality checks, error handling, and recovery mechanisms

Requirements

  • Extensive experience with data warehousing solutions like Vertica, Snowflake, or Databricks to support large-scale data operations
  • Strong knowledge of data governance and data quality frameworks to ensure accuracy, consistency, and compliance in data pipelines
  • Hands-on experience with machine learning pipelines and MLOps, particularly in personalization, content recommendations, ad-tech solutions, or e-commerce optimization within streaming or TVOS platforms
  • Expertise in content metadata processing and management, ideally within a streaming, TVOS, or e-commerce environment
  • Familiarity with entertainment catalog systems, taxonomy structures, and content discovery workflows
  • Prior work on recommendation systems or search technologies, enhancing user experiences through AI-driven content surfacing
  • Strong ability to communicate effectively across technical and non-technical stakeholders, ensuring alignment on data strategies and priorities
  • A solid foundation in computer science and engineering fundamentals, enabling you to design scalable, high-performance systems
  • Deep experience with distributed systems and data processing frameworks, working with large-scale datasets in a high-traffic environment
  • Proficiency in one or more programming languages such as Java, Scala, or Python for building and maintaining robust data pipelines
  • Strong expertise in big data technologies such as Apache Spark, Kafka, and other scalable data processing tools
  • Solid understanding of data modeling concepts and experience with both relational and NoSQL database systems
  • Hands-on experience with cloud platforms like AWS, GCP, or Azure, and containerization technologies such as Kubernetes
  • Proven ability to lead and complete projects with high levels of technical ambiguity while mentoring and contributing to the growth of peers
  • 11+ years of software development experience, with at least 5 years specializing in data engineering or pipeline development
  • A Bachelor’s or Master’s degree in Computer Science, Engineering, or a related field, or equivalent industry experience

Benefits

  • Comprehensive healthcare (medical, dental, and vision) with premiums paid in full for employees and dependents
  • Retirement benefits such as a 401k plan and company match
  • Short and long-term disability coverage
  • Basic life insurance
  • Well-being benefits
  • Reimbursement for certain tuition expenses
  • Parental leave
  • Sick time of 1 hour per 30 hours worked
  • Vacation time for full-time employees of up to 120 hours through the first year and 160 hours thereafter
  • Around 13 paid holidays per year
  • Employees can also purchase The Trade Desk stock at a discount through The Trade Desk’s Employee Stock Purchase Plan