About
Build scalable and optimized ELT/ETL data pipelines using the modern data stack (Snowflake, dbt, and orchestration tools such as Airflow). Be the primary owner of dbt (data build tool) projects, including developing modular and tested data models, implementing data quality checks, and maintaining comprehensive documentation.
Data Warehouse Management (Snowflake): Implement advanced data modeling techniques (e.g., dimensional modeling, Data Vault) within Snowflake to create clean, reliable datasets for analytics and business intelligence. Optimize Snowflake performance and cost efficiency by tuning SQL queries, managing virtual warehouses, and utilizing advanced features such as clustering, caching, and Snowpipe.
SQL and Problem-Solving: Write complex, performant SQL to transform raw data into high-value business metrics. Proactively identify and resolve complex data-related issues, bottlenecks, and performance degradation across the data ecosystem.
Collaboration and Communication: Work closely with data analysts, business intelligence developers, and product teams to translate business requirements into technical data solutions. Provide technical guidance, participate in code reviews of dbt models and SQL logic, and champion data engineering best practices.
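As an illustration of the "raw data into business metrics" transformation described above, here is a minimal sketch. It uses an in-memory SQLite database as a stand-in for Snowflake (the table and column names are hypothetical, not from this posting), showing the kind of filter-and-aggregate SQL a dbt model would encapsulate:

```python
import sqlite3

# Assumption: an in-memory SQLite DB stands in for Snowflake; raw_orders is a
# hypothetical raw-events table, not part of the original job description.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE raw_orders (
        order_id   INTEGER PRIMARY KEY,
        ordered_at TEXT,   -- ISO-8601 date
        status     TEXT,
        amount_usd REAL
    );
    INSERT INTO raw_orders VALUES
        (1, '2024-01-01', 'completed', 120.0),
        (2, '2024-01-01', 'cancelled',  80.0),
        (3, '2024-01-02', 'completed',  50.0),
        (4, '2024-01-02', 'completed',  30.0);
""")

# Transform: exclude cancelled orders, then aggregate to a daily revenue metric.
daily_revenue = conn.execute("""
    SELECT ordered_at      AS order_date,
           COUNT(*)        AS completed_orders,
           SUM(amount_usd) AS revenue_usd
    FROM raw_orders
    WHERE status = 'completed'
    GROUP BY ordered_at
    ORDER BY ordered_at
""").fetchall()

print(daily_revenue)  # → [('2024-01-01', 1, 120.0), ('2024-01-02', 2, 80.0)]
```

In a dbt project, the SELECT statement would live in its own model file, with data quality checks (e.g., not-null and accepted-values tests) declared alongside it.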
Ideal skills
- Documentation
- Performance Tuning
- SQL
Professional experience
- Data Engineer
Language skills
- English
Notice to users
This offer comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.