About
- Design and implement scalable data ingestion pipelines and ETL/ELT workflows
- Build and manage a modern lakehouse on AWS, optimizing performance and cost
- Establish a managed metadata repository and governance controls for data quality and lineage
Required Qualifications
- Bachelor's degree in Engineering, IT, Computer Science, or a related field (or equivalent experience)
- Minimum of four years of experience building production data pipelines and/or data platforms
- Hands-on experience with data lake/delta lake (lakehouse) on AWS and data ingestion workflows
- Proficiency in SQL and one programming language commonly used for data engineering (Python preferred)
- Experience with automated AWS provisioning using Infrastructure as Code (IaC) and CI/CD pipelines
Languages
- English