About
- Design, develop, and maintain scalable ETL/ELT pipelines to support data processing and analytics.
- Automate data ingestion and transformation workflows using APIs, scripting, and orchestration tools.
- Implement automated health checks, data quality validation, and pipeline monitoring frameworks.
- Utilize technologies such as Python, SQL, Snowflake SQL, and Databricks for data engineering and analysis.
- Develop and maintain comprehensive documentation, including:
  - Architectural and technical diagrams
  - Job schedules and dependencies
  - Data dictionaries and schema descriptions
  - Validation and testing methodologies
- Collaborate with cross-functional teams to ensure smooth integration with business systems.
- Implement and validate data specifications and business rules to meet functional and compliance requirements.
- Deliver accurate and timely data exports and reporting deliverables.
- Resolve outstanding data and process backlog items within defined timelines.
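To illustrate the kind of data-quality validation work described above, here is a minimal, self-contained Python sketch of row-level checks a pipeline step might run before loading data downstream. The field names (`id`, `amount`) and the rules (completeness, uniqueness, validity) are invented for illustration and are not taken from this posting:

```python
def validate_rows(rows, required_fields=("id", "amount")):
    """Return (valid_rows, errors) after basic data-quality checks.

    Hypothetical example: field names and rules are illustrative only.
    """
    valid, errors = [], []
    seen_ids = set()
    for i, row in enumerate(rows):
        # Completeness: every required field must be present and non-null.
        missing = [f for f in required_fields if row.get(f) is None]
        if missing:
            errors.append((i, f"missing fields: {missing}"))
            continue
        # Uniqueness: the primary key must not repeat within the batch.
        if row["id"] in seen_ids:
            errors.append((i, "duplicate id"))
            continue
        seen_ids.add(row["id"])
        # Validity: numeric fields must parse and be non-negative.
        try:
            if float(row["amount"]) < 0:
                errors.append((i, "negative amount"))
                continue
        except (TypeError, ValueError):
            errors.append((i, "non-numeric amount"))
            continue
        valid.append(row)
    return valid, errors

batch = [
    {"id": 1, "amount": "10.5"},
    {"id": 1, "amount": "3.0"},   # duplicate id
    {"id": 2, "amount": None},    # missing amount
    {"id": 3, "amount": "-1"},    # negative amount
]
valid, errors = validate_rows(batch)
```

In practice, checks like these would typically be expressed in SQL or a validation framework and wired into the pipeline's monitoring, with failures routed to alerts rather than returned in memory.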
Skills:
Data Processing, Documentation, SQL
Desirable Skills
- ETL
- Python
- SQL
- Scripting
- Databricks
Work Experience
- Data Engineer
Language Skills
- English
Note for Users
This job posting comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.