About
Key Responsibilities
- Design, develop, and maintain scalable ETL/ELT pipelines using Dataflow, Dataform, and BigQuery.
- Implement data modeling, transformation, and orchestration best practices for analytics workflows.
- Collaborate with data scientists, analysts, and business teams to deliver insights via Looker dashboards and reports.
- Ensure data reliability, quality, and governance across multiple data sources.
- Optimize BigQuery queries, partitions, and clustering for performance and cost efficiency.
- Write Python scripts for automation, data processing, and integration with APIs.
- Implement CI/CD practices for data pipeline deployments.
- Monitor, troubleshoot, and improve data infrastructure performance and reliability.
- Document data models, pipelines, and dashboards to ensure transparency and maintainability.
Language Skills
- English
Note for Users
This job offer comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.