About
- Design, develop, and maintain scalable ETL/ELT pipelines to support data processing and analytics.
- Automate data ingestion and transformation workflows using APIs, scripting, and orchestration tools.
- Implement automated health checks, data quality validation, and pipeline monitoring frameworks.
- Use technologies such as Python, SQL, Snowflake SQL, and Databricks for data engineering and analysis.
- Develop and maintain comprehensive documentation, including:
  - Architectural and technical diagrams
  - Job schedules and dependencies
  - Data dictionaries and schema descriptions
  - Validation and testing methodologies
- Collaborate with cross-functional teams to ensure smooth integration with business systems.
- Implement and validate data specifications and business rules to meet functional and compliance requirements.
- Deliver accurate and timely data exports and reports.
- Resolve outstanding data and process backlog items within defined timelines.
Skills
- Data processing
- Documentation
- SQL
Nice-to-have skills
- ETL
- Python
- SQL
- Scripting
- Databricks
Work experience
- Data Engineer
Languages
- English