About
- Design and implement scalable data ingestion pipelines and ETL/ELT workflows
- Build and manage a modern lakehouse on AWS, optimizing performance and cost
- Establish a managed metadata repository and governance controls for data quality and lineage
Required Qualifications
- Bachelor's degree in Engineering, IT, Computer Science, or a related field (or equivalent experience)
- Minimum of four years of experience building production data pipelines and/or data platforms
- Hands-on experience with data lake/delta lake (lakehouse) architectures on AWS and data ingestion workflows
- Proficiency in SQL and one programming language commonly used for data engineering (Python preferred)
- Experience with automated AWS provisioning using Infrastructure as Code (IaC) and CI/CD pipelines
Language Skills
- English
Note for Users
This job posting originates from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.