About
Location: Hybrid in Oak Park Heights, MN (3 days onsite per week)
Duration: 3-6 months with potential to convert
Rate: $67.61-$70.55, dependent on skills and qualifications
Tech Stack: Snowflake, dbt, Fivetran, Azure (ADO, Blob Storage, Functions, etc.)

Role Overview
We are seeking a seasoned Senior Data Engineer to lead technical execution and provide architectural guidance across our onshore and offshore engineering teams. In this role, you will be the primary technical point of contact, ensuring that complex data requirements are translated into scalable, resilient, and highly optimized solutions. You will not only build but also influence the standards for a modern data stack centered on Snowflake and Azure, driving "Governance as Code" and operational excellence.
Key Responsibilities
1. Technical Leadership & Cross-Shore Guidance
Engineering Anchor: Act as the primary technical lead for distributed teams, ensuring clarity in requirements and maintaining high standards of execution across time zones.
Architectural Blueprinting: Engage with Product Owners and Solution Architects to design optimal data product pipelines that serve as the foundational reference for the broader engineering team.
Resilience Engineering: Design systems for high availability and fault tolerance, ensuring the data platform can recover gracefully from upstream failures.
2. Pipeline Engineering & Data Modeling
Modern Data Stack Mastery: Engineer and optimize full-lifecycle data pipelines using Fivetran, Snowflake, and dbt, focusing on large-scale, complex datasets.
Metadata-Driven Automation: Design and implement config-driven or metadata-driven pipelines to increase development velocity and reduce manual overhead.
Layered Frameworks: Apply advanced modeling techniques (Data Vault, Dimensional/Star Schema) to create high-performance, curated, reusable core datasets and purpose-built datasets optimized for analytics and AI.
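The config-driven pattern named above can be sketched as follows. This is a minimal illustration, not a prescribed implementation: the registry fields, table names, and load modes are all hypothetical, and a real pipeline would execute the rendered statements against Snowflake rather than return them.

```python
# Minimal sketch of a metadata-driven pipeline: per-table load logic lives
# in one config registry, and a generic runner renders each load from that
# metadata instead of hand-writing a pipeline per table.
# All table names and fields below are hypothetical.

PIPELINE_CONFIG = [
    {"source": "crm.accounts", "target": "raw.accounts",
     "load": "incremental", "key": "account_id"},
    {"source": "erp.invoices", "target": "raw.invoices", "load": "full"},
]

def build_statement(cfg: dict) -> str:
    """Render a load statement from one metadata entry."""
    if cfg["load"] == "incremental":
        return (f"MERGE INTO {cfg['target']} t USING {cfg['source']} s "
                f"ON t.{cfg['key']} = s.{cfg['key']}")
    return f"INSERT OVERWRITE INTO {cfg['target']} SELECT * FROM {cfg['source']}"

def run_pipelines(configs: list) -> list:
    """Onboarding a new table becomes a config change, not new code."""
    return [build_statement(cfg) for cfg in configs]
```

The payoff is development velocity: adding a source table means appending one registry entry, which the same tested runner picks up automatically.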
3. Performance & Cost Optimization
Snowflake Expert: Apply advanced proficiency in Snowflake performance tuning (clustering, warehouse profiling, query optimization) to minimize both latency and Azure consumption costs.
End-to-End Efficiency: Monitor and tune the entire flow from ingestion to transformation to ensure the stack remains performant as data volumes scale.
4. Governance, Security & DataOps
Governance as Code: Implement and validate automated data lineage, quality checks, and data classification within the CI/CD workflow.
Observability & Health: Drive platform reliability by implementing end-to-end observability; proactively monitor data health and enforce rigorous quality gates using dbt.
Azure Integration: Manage and optimize data flows within the Azure ecosystem, leveraging Azure DevOps (ADO), Blob Storage, and Azure Functions.
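The "Governance as Code" idea above can be illustrated with a small CI-style gate. This is a sketch under assumptions: the catalog export format and its field names are invented for illustration and do not correspond to a real dbt or Snowflake API.

```python
# Hypothetical governance-as-code gate for a CI/CD workflow: fail the build
# if any dataset in an (assumed) catalog export lacks a data classification
# or an owner. Field names are illustrative only.

REQUIRED_FIELDS = ("classification", "owner")

def governance_violations(catalog: list) -> list:
    """Return human-readable violations; an empty list means the gate passes."""
    problems = []
    for dataset in catalog:
        for field in REQUIRED_FIELDS:
            if not dataset.get(field):
                problems.append(f"{dataset['name']}: missing {field}")
    return problems

catalog = [
    {"name": "raw.accounts", "classification": "confidential", "owner": "data-eng"},
    {"name": "raw.invoices", "classification": None, "owner": "data-eng"},
]
```

Wired into the pipeline (e.g. an Azure DevOps stage), a non-empty result would fail the build, making classification and ownership enforceable checks rather than documentation conventions.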
Required Qualifications
Experience: 8–10 years of experience building and optimizing large-scale, complex data architectures and pipelines.
Core Stack: Expert-level command of Snowflake, dbt, and Fivetran.
Cloud Infrastructure: Strong proficiency in Azure services (Storage, Compute, and DevOps/CI/CD).
Modeling: Proven ability to engineer layered data frameworks using various modeling methodologies (e.g., Data Vault 2.0).
Leadership: Experience guiding offshore teams and conducting technical code reviews to ensure consistency and adherence to patterns.
Preferred "Good to Have" Skills
AI/ML Enablement: Experience in building data foundations that enable Machine Learning and Generative AI use cases (e.g., Vector databases, feature stores).
Advanced Governance: Experience with automated data privacy/masking and advanced metadata cataloging.
Language Skills
- English
Note for Users
This job listing comes from a TieTalent partner platform. Click "Jetzt Bewerben" ("Apply Now") to submit your application directly on their website.