About
- Develop and maintain data ingestion pipelines from multiple sources
- Build ETL/ELT processes for data transformation
- Integrate structured and unstructured data into data platforms
- Implement incremental and automated data ingestion processes
- Ensure data quality through validations and monitoring
- Perform data cleaning and normalization tasks
- Optimize pipelines for performance and scalability
- Document data pipelines and data structures
- Collaborate with cross-functional teams to support data initiatives

Requirements
- Bachelor’s degree in Computer Science, Engineering, or a related field
- Experience in data engineering or data pipeline development
- Knowledge of ETL/ELT processes and data integration concepts
- Experience with cloud-based data platforms (AWS or similar)
- Experience working with APIs and database integrations
- Proficiency in SQL and scripting languages (Python or similar)
- Familiarity with distributed data processing frameworks

Nice to Have
- Experience with AWS data services
- Knowledge of infrastructure-as-code tools
- Familiarity with data lake architectures
- Experience with Spark or similar technologies
- Experience with data pipeline orchestration tools

Soft Skills
- Analytical thinking
- Problem-solving skills
- Collaborative mindset
- Attention to detail
Language Skills
- English
Note for Applicants

This job offer comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.