About
Key Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines using Dataflow, Dataform, and BigQuery.
- Implement data modeling, transformation, and orchestration best practices for analytics workflows.
- Collaborate with data scientists, analysts, and business teams to deliver insights via Looker dashboards and reports.
- Ensure data reliability, quality, and governance across multiple data sources.
- Optimize BigQuery queries, partitions, and clustering for performance and cost efficiency.
- Write Python scripts for automation, data processing, and integration with APIs.
- Implement CI/CD practices for data pipeline deployments.
- Monitor, troubleshoot, and improve data infrastructure performance and reliability.
- Document data models, pipelines, and dashboards to ensure transparency and maintainability.
Languages
- English
Notice for Users
This job is posted by a TieTalent partner platform. Click "Apply Now" to submit your application directly on the partner's site.