Lyons Consulting Group
Posted 3 days ago
Atlanta, GA
Professional, Scientific, and Technical Services

About the position

As a GCP Engineer, you will contribute to the migration of a legacy data warehouse to a Google Cloud-based data warehouse for a major telecom client. You will collaborate with Data Product Managers and Data Architects to design, implement, and deliver successful data solutions, and help architect data pipelines for the underlying data warehouse and data marts. You will design and develop complex ETL pipelines in Google Cloud data environments: the legacy stack is built on Teradata, while the new stack uses GCP data technologies such as BigQuery and Airflow, with SQL and Python as the primary languages. You will maintain detailed documentation of your work and changes to support data quality and data governance, support QA and UAT data testing activities, and support deployments to higher environments. You will ensure high operational efficiency and quality in your solutions to meet SLAs and uphold our commitment to our customers (Data Science and Data Analytics teams), and be an active participant in and advocate of agile/scrum practice to keep your team healthy and its processes improving.

Responsibilities

  • Contribute to the migration of a legacy data warehouse to a Google Cloud-based data warehouse.
  • Collaborate with Data Product Managers and Data Architects.
  • Design, implement, and deliver successful data solutions.
  • Help architect data pipelines for the underlying data warehouse and data marts.
  • Design and develop complex ETL pipelines in Google Cloud data environments.
  • Maintain detailed documentation of work and changes to support data quality and data governance.
  • Support QA and UAT data testing activities.
  • Support deployment activities to higher environments.
  • Ensure high operational efficiency and quality of solutions to meet SLAs.
  • Participate in and advocate for agile/scrum practices.

Requirements

  • 8 years of data engineering experience developing large data pipelines in complex environments.
  • Strong SQL skills and the ability to build complex transformation data pipelines using a custom ETL framework in a Google BigQuery environment.
  • Exposure to Teradata and the ability to understand complex Teradata BTEQ scripts.
  • Strong Python programming skills.
  • Strong skills in building and debugging Airflow jobs.
  • Ability to optimize queries in BigQuery.
  • Hands-on experience with Google Cloud data technologies (GCS, BigQuery, Dataflow, Pub/Sub, Data Fusion, Cloud Functions).
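As a hedged illustration of the BigQuery query-optimization skill listed above, the sketch below builds a partition-pruned query in plain Python so BigQuery scans only the requested date partitions, a common cost and latency optimization. The table and column names are hypothetical, not from this posting.

```python
from datetime import date


def partition_pruned_query(table: str, start: date, end: date) -> str:
    """Build a BigQuery SELECT that filters on the pseudo-column
    _PARTITIONDATE so only the needed date partitions are scanned.
    The table name passed in is illustrative, not a real dataset."""
    return (
        f"SELECT * FROM `{table}` "
        f"WHERE _PARTITIONDATE BETWEEN '{start.isoformat()}' "
        f"AND '{end.isoformat()}'"
    )


# Hypothetical project/dataset/table, for illustration only.
sql = partition_pruned_query("proj.ds.events", date(2024, 1, 1), date(2024, 1, 7))
print(sql)
```

Restricting the scan with the partition column (rather than filtering on an unpartitioned timestamp) is one of the standard levers for the "optimize queries in BigQuery" requirement.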

Nice-to-haves

  • Experience with cloud data warehouse technologies such as BigQuery.
  • Experience with cloud technologies such as GCP (GCS, Dataproc, Pub/Sub, Dataflow, Data Fusion, Cloud Functions).
  • Exposure to Teradata.
  • Solid experience with job orchestration tools like Airflow and the ability to build complex jobs.
  • Experience writing and maintaining large data pipelines using a custom ETL framework.
  • Ability to automate jobs using Python.
  • Familiarity with data modeling techniques and data warehousing standard methodologies and practices.
  • Strong experience with code version control repositories such as GitHub.
  • Good scripting skills, including Bash scripting and Python.
  • Familiarity with Scrum and Agile methodologies.
  • Problem solver with strong attention to detail and excellent analytical and communication skills.
  • Ability to work in an onsite/offshore model and to lead a team.
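As a hedged sketch of the "automate jobs using Python" and scripting items above, the minimal runner below wraps a shell command and reports its exit status; the command shown is a placeholder, not part of this posting.

```python
import subprocess


def run_job(cmd: list[str]) -> int:
    """Run a shell job and return its exit code; a minimal stand-in
    for Python-driven job automation around Bash-style commands."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    if result.returncode != 0:
        # Surface stderr so failures are visible in job logs.
        print(f"job failed: {result.stderr.strip()}")
    return result.returncode


# Placeholder command standing in for a real nightly-load script.
status = run_job(["echo", "nightly-load"])
```

In practice such a wrapper would be scheduled by an orchestrator like Airflow rather than run ad hoc.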

Benefits

  • Flexible work.
  • Healthcare including dental, vision, mental health, and well-being programs.
  • Financial well-being programs such as 401(k) and Employee Share Ownership Plan.
  • Paid time off and paid holidays.
  • Paid parental leave.
  • Family building benefits like adoption assistance, surrogacy, and cryopreservation.
  • Social well-being benefits like subsidized back-up child/elder care and tutoring.
  • Mentoring, coaching and learning programs.
  • Employee Resource Groups.
  • Disaster Relief.