The Options Clearing Corporation
Posted 17 days ago
$113,700 - $195,400/Yr
Full-time • Mid Level
Hybrid • Chicago, IL
Securities, Commodity Contracts, and Other Financial Investments and Related Activities

About the position

The candidate will be responsible for designing and delivering scalable, resilient hybrid and cloud-based applications and data solutions supporting critical financial market clearing and risk activities, helping to drive the strategy of transforming the enterprise into a data-driven organization, and leading through innovative strategic thinking in building data solutions. You will be part of the Data team, a diverse group of dedicated engineers who are passionate about data. As a member of the Data team, you will be responsible for crafting and building cloud-based applications and data systems that will serve as the backbone for enterprise data management and analytics capabilities.

You will join the core team responsible for design, development, and implementation, working closely with internal and external business and technology partners. Together we will define the system architecture, the technology stack, and its tactical implementation. You will help us take on the unique technical challenges of handling large datasets and managing streaming data in public cloud and hybrid environments: building large, complex data pipelines; integrating data from diverse sources and formats; implementing continuous integration/continuous delivery pipelines; automating everything we can get our hands on; and contributing to our API-first design principles. You will have a rare and challenging opportunity to apply your technical skills, knowledge, and experience, acquire new skills, and grow with us.

Responsibilities

  • Design and deliver scalable and resilient hybrid and Cloud-based applications and data solutions.
  • Drive the strategy of transforming the enterprise into a data-driven organization.
  • Craft and build cloud-based applications and data systems.
  • Define system architecture and technology stack.
  • Handle large datasets and manage streaming data in public cloud and hybrid environments.
  • Build large and complex data pipelines.
  • Integrate data from diverse sources in different formats.
  • Implement continuous integration/continuous delivery pipelines.
  • Automate processes where possible.
  • Participate in the implementation of API-first design principles.

Requirements

  • 5+ years of hands-on experience with Java.
  • Strong knowledge of SQL.
  • 5+ years of hands-on experience with Big Data and distributed stream processing frameworks such as Hadoop, Kafka, Hive, and Presto.
  • 3+ years of hands-on experience with stream processing engines such as Apache Storm, Spark, Flink, or Beam.
  • Experience working with Cloud ecosystems (AWS, Azure, GCP).
  • Experience with data storage formats such as Apache Parquet, Avro, or ORC.
  • Knowledge and understanding of DevOps tools and technologies such as Terraform, Git, and Jenkins.
  • Familiarity with Kubernetes and container orchestration technologies such as Rancher, EKS, or GKE.
  • Good understanding of data integration patterns, technologies, and tools.

Nice-to-haves

  • Experience with table formats such as Iceberg, Delta Lake, or Hudi is a plus.
  • Work experience in the capital markets is preferred.
  • AWS Certified Solutions Architect – Associate certification is a plus.

Benefits

  • A hybrid work environment, with up to 2 days per week of remote work.
  • Tuition Reimbursement to support your continued education.
  • Student Loan Repayment Assistance.
  • Technology Stipend allowing you to use the device of your choice to connect to our network while working remotely.
  • Generous PTO and Parental leave.
  • 401k Employer Match.
  • Competitive health benefits including medical, dental and vision.