ICF International • Posted 6 days ago
$73,722 - $151,646/Yr
Full-time • Mid Level
Remote • Reston, VA
Professional, Scientific, and Technical Services

Our Health Engineering Solutions (HES) team works side by side with customers to articulate a vision for success, and then make it happen. We know success doesn't happen by accident. It takes the right team of people, working together on the right solutions for the customer. We are looking for a seasoned Data Engineer who will be a key driver in making this happen. Our team supports the vision of improving the quality of patient care and consumer decision-making about hospital providers across the country by collecting, computing, and publicly reporting outcomes-based hospital quality measure data. Under this contract, we will develop a human-centric reporting system that supports refinement, filtering, and data comparison, as well as viewing supplemental information. Leveraging modern technologies, DevOps practices, and cloud-based infrastructure, our dynamic work environment involves multiple project teams collaborating toward a common vision of delivering an integrated solution.

Responsibilities:
  • Design and build data processing pipelines using tools and frameworks in the AWS ecosystem.
  • Analyze requirements and architecture specifications to create a detailed design document.
  • Perform data engineering functions, including data extraction, transformation, loading, and integration, in support of modern cloud computing platforms such as AWS.
  • Work with large data sets alongside other Data Engineers and Data Scientists to analyze the data using various algorithms.
  • Implement and configure big data technologies as well as tune processes for performance at scale.
  • Design and build ETL pipelines to automate the ingestion and migration of structured and unstructured data (a brief PySpark sketch follows this list).
  • Work with DevOps engineers on CI, CD, and IaC (Continuous Integration, Continuous Delivery, and Infrastructure as Code) processes, and read specifications and translate them into code and design documents.
  • Proactively monitor the scalability, performance, and availability of our systems.
  • Deploy the developed solution in the AWS environment and examine the results for accuracy.
  • Write complex unit and integration tests for all data processing code.
  • Perform code reviews and develop processes for improving code quality.
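To give a concrete flavor of this work, the following is a minimal PySpark ETL sketch of the kind described above: it reads structured data from S3, applies basic validation, and writes partitioned output back out. All bucket names, paths, and column names (for example, hospital_id, measure_id, and measure_score) are hypothetical placeholders, not details of the actual system.

    # Minimal PySpark ETL sketch. All paths, bucket names, and column names
    # are hypothetical placeholders rather than details of the actual system.
    from pyspark.sql import SparkSession
    from pyspark.sql import functions as F

    def build_spark() -> SparkSession:
        # On AWS this would typically run on EMR or Glue, where S3 access
        # and IAM credentials are already configured.
        return SparkSession.builder.appName("hospital-quality-etl").getOrCreate()

    def run_etl(spark: SparkSession, source_path: str, target_path: str) -> None:
        # Extract: read raw measure submissions (CSV here for simplicity).
        raw = spark.read.option("header", "true").csv(source_path)

        # Transform: drop rows missing a hospital identifier, cast the score
        # to a double, and keep only scores in a plausible 0-100 range.
        cleaned = (
            raw.filter(F.col("hospital_id").isNotNull())
               .withColumn("measure_score", F.col("measure_score").cast("double"))
               .filter(F.col("measure_score").between(0.0, 100.0))
        )

        # Load: write partitioned Parquet back to S3 for downstream reporting.
        cleaned.write.mode("overwrite").partitionBy("measure_id").parquet(target_path)

    if __name__ == "__main__":
        spark = build_spark()
        run_etl(
            spark,
            source_path="s3://example-bucket/raw/quality_measures/",      # hypothetical
            target_path="s3://example-bucket/curated/quality_measures/",  # hypothetical
        )
        spark.stop()

In practice, a transformation like the cleaning step above would also be covered by unit tests, for example with pytest against a local SparkSession.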
Required qualifications:
  • Bachelor's degree required (degree in Computer Science or related field preferred)
  • 5+ years of experience with high-volume data processing using Python, PySpark, the Spark Engine, and the Spark Dataset API
  • 2+ years of experience with Agile methodology
  • 2+ years of experience building data pipelines and performing data validation
  • Candidate must be able to obtain and maintain a Public Trust Clearance
  • Candidate must reside in the U.S., be authorized to work in the U.S., and all work must be performed in the U.S.
  • Candidate must have lived in the U.S. for three (3) full years out of the last five (5) years
Preferred qualifications:
  • SAS experience strongly preferred
  • Master's degree and 5+ years of technical experience
  • Experience working in the healthcare industry with PHI/PII
  • Federal Government contracting work experience
  • Expertise working as part of a dynamic, interactive Agile team
  • Strong written and verbal communication skills
  • Prior experience working remotely full-time
Benefits:
  • 401(k)
  • Health insurance
  • Paid holidays
  • Flexible scheduling
  • Professional development