JPMorgan Chase • Posted 21 days ago
Full-time • Mid Level
Wilmington, DE
Credit Intermediation and Related Activities

About the position

We have an exciting and rewarding opportunity for you to take your career to the next level. As a Software Engineer III - AWS/Data at JPMorgan Chase within the Corporate Sector - Consumer and Community Banking Risk team, you are part of an agile team that works to enhance, design, and deliver data collection, storage, access, and analytics solutions in a secure, stable, and scalable way. As an emerging member of a data engineering team, you execute data solutions through the design, development, and technical troubleshooting of multiple components within a technical product, application, or system, while gaining the skills and experience needed to grow within your role.

Responsibilities

  • Organize, update, and maintain gathered data to make it actionable
  • Demonstrate basic knowledge of data system components to determine the controls needed to ensure secure data access
  • Make custom configuration changes in one or two tools to generate a product at the business's or customer's request
  • Build and develop automation tools
  • Troubleshoot priority incidents, facilitate blameless post-mortems, and ensure permanent closure of incidents
  • Add to a team culture of diversity, equity, inclusion, and respect

Requirements

  • Formal training or certification on Data Engineering concepts and 3+ years of applied experience
  • Demonstrated ability to work independently, with strong ownership, collaboration, and communication skills
  • Experience in data lifecycle and data management functions
  • Experience with Spark, shell/Perl scripting, and Python or Java
  • Hands-on experience with AWS services such as ECS, EKS, and EMR
  • Significant experience with statistical data analysis and the ability to determine the appropriate tools for the analysis
  • Basic knowledge of data system components to determine the controls needed
  • Experience with SRE operations and principles

Nice-to-haves

  • Advanced knowledge of SQL (e.g., joins and aggregations), Python, and Spark
  • Experience with workflow automation tools such as Control-M
  • Experience in maintaining and optimizing cloud-based infrastructure for enhanced performance and reliability
  • Experience with Apache Airflow
  • Experience with monitoring tools such as Grafana and Dynatrace