Zuora · posted 2 days ago
$146,500 - $201,450/Yr
Full-time
Redwood City, CA

About the position

At Zuora, we do Modern Business. We help people subscribe to new ways of doing business that are better for people, companies, and ultimately the planet. This approach grew out of the shift to the Subscription Economy: it puts customers first by building recurring relationships instead of one-time product sales, and it focuses on sustainable growth. Through our leading expertise and multi-product suite, we are transforming industries and working with the world's most innovative companies to monetize new business models, nurture subscriber relationships, and optimize their digital experiences.

The Data Platform team at Zuora is driven by a mission to unlock the full potential of the company's data assets. The team builds the foundational infrastructure and robust data pipelines that power everything from product insights to strategic business decisions. Its work directly impacts Zuora's success by providing accurate, timely, and accessible data to analytics, product, and business teams. This is a pivotal time to join, as the company redefines its data architecture and embraces cutting-edge technologies.

Responsibilities

  • Enhance Data Pipeline Efficiency and Reliability: Design and implement robust data pipelines, reducing data processing latency and improving data quality.
  • Drive Data Accessibility and Self-Service: Collaborate with teams to define and implement data models and schemas, enabling seamless data integration and self-service analytics.
  • Optimize Cloud Data Infrastructure: Contribute to the design and implementation of scalable data storage solutions and improve data processing performance in the cloud environment.

Requirements

  • Designing, building, and maintaining production-grade data pipelines, including ETL/ELT processes and big data solutions.
  • Utilizing modern data engineering technologies such as Kafka, Spark, Flink, Trino, and Debezium for change data capture.
  • Working with cloud platforms (AWS, Azure, or GCP) and related services, including Kubernetes and Terraform.
  • Demonstrating strong proficiency in programming languages such as Java and Python, along with expert-level SQL skills.

Nice-to-haves

  • Experience with Data Lake/Lake House solutions like Apache Iceberg.
  • Knowledge of modern data warehousing solutions such as Snowflake.
  • Familiarity with data governance principles and tools, including data cataloging and lineage tracking.
  • Experience with real-time analytics platforms or streaming architectures.
  • Hands-on experience with AI-powered development tools such as Cursor, GitHub Copilot, or Gemini AI to improve productivity and streamline coding workflows.

Benefits

  • Competitive compensation, variable bonus and performance reward opportunities, and retirement programs
  • Medical, dental and vision insurance
  • Generous, flexible time off
  • Paid holidays, 'wellness' days, and a company-wide end-of-year break
  • 6 months of fully paid parental leave
  • Learning & Development stipend
  • Opportunities to volunteer and give back, including charitable donation match
  • Free resources and support for your mental wellbeing
© 2024 Teal Labs, Inc