This job posting is no longer available
About
- Design, develop, and maintain scalable ETL/ELT pipelines to support data processing and analytics.
- Automate data ingestion and transformation workflows using APIs, scripting, and orchestration tools.
- Implement automated health checks, data quality validation, and pipeline monitoring frameworks.
- Utilize technologies such as Python, SQL, Snowflake SQL, and Databricks for data engineering and analysis.
- Develop and maintain comprehensive documentation, including:
  - Architectural and technical diagrams
  - Job schedules and dependencies
  - Data dictionaries and schema descriptions
  - Validation and testing methodologies
- Collaborate with cross-functional teams to ensure smooth integration with business systems.
- Implement and validate data specifications and business rules to meet functional and compliance requirements.
- Deliver accurate and timely data exports and reporting deliverables.
- Resolve outstanding data and process backlog items within defined timelines.
Skills:
Data Processing, Documentation, SQL
Ideal skills
- ETL
- Python
- SQL
- Scripting
- Databricks
Professional experience
- Data Engineer
Language skills
- English
Notice to users
This posting was published by one of our partners. You can view the original posting here.