About
Skywalk Global is seeking an AWS Data Engineer to develop and optimize data workflows and infrastructure using AWS services. The role involves building ETL jobs, managing data pipelines, and collaborating with engineers to enhance platform reliability and scalability.
Responsibilities:
• Develop, maintain, and optimize AWS Glue ETL jobs using PySpark
• Build and manage event-driven data workflows using AWS services
• Design and deploy cloud infrastructure using AWS CDK (TypeScript)
• Develop and maintain AWS Lambda functions and API integrations
• Extend data pipelines to support new data sources and validation logic
• Troubleshoot pipeline failures and improve data-processing performance
• Manage infrastructure deployments across environments using CI/CD pipelines
• Work with Aurora PostgreSQL and relational data systems
• Collaborate with engineers to enhance platform reliability, scalability, and automation
• Participate in code reviews and infrastructure improvements
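To give a flavor of the "validation logic" responsibility above, here is a minimal sketch in plain Python. All field names and rules are hypothetical, not taken from the posting; in a Glue job this kind of check would typically be applied inside a PySpark transform or DynamicFrame filter.

```python
# Hypothetical example: validate raw event records before loading them
# into a downstream store. Field names and rules are illustrative only.
REQUIRED_FIELDS = {"event_id", "source", "timestamp"}

def validate_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record is valid."""
    errors = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        errors.append(f"missing fields: {sorted(missing)}")
    if "timestamp" in record and not isinstance(record["timestamp"], (int, float)):
        errors.append("timestamp must be numeric (epoch seconds)")
    return errors

def split_valid_invalid(records):
    """Partition records so invalid rows can be routed to a dead-letter location."""
    valid, invalid = [], []
    for rec in records:
        (valid if not validate_record(rec) else invalid).append(rec)
    return valid, invalid
```

Keeping the rule set as pure functions like this makes the validation easy to unit-test outside the Glue runtime and to reuse when new data sources are onboarded.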
Qualifications (required):
• 3+ years of experience in software engineering or data engineering
• Hands-on experience with AWS data and compute services
• Experience with at least 2–3 of the following: AWS Glue, AWS Lambda, AWS CDK, PySpark, ETL pipeline development
• Strong understanding of SQL and relational databases
• Experience working with CI/CD pipelines
• Proficiency with Git and collaborative development workflows
Company:
Skywalk Global offers IT consulting, application development, testing, and enterprise solutions. Founded in 2012, the company is headquartered in Des Moines, USA, and has 51–200 employees. It is currently in its growth stage.
Languages
- English