
Lead Data Engineer

Jersey Hired
  • United States

About

Job Description
Jersey Hired is scouting for a heavy-hitting Senior Data Engineer to join a global consulting powerhouse. This isn't just about moving data from Point A to Point B; it's about architecting a fortress.
We need a pro who can design, build, and operate secure, audited, and cost-efficient pipelines on Snowflake. You'll own the full journey: taking raw ingestion through Data Vault 2.0 models and delivering it into high-impact consumption layers. If you're a Terraform-wielding, dbt-loving, Airflow-orchestrating engineer who treats "audit-ready" as a lifestyle, we want to talk.
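The "audit-ready" bar above hinges on pipelines that can be safely rerun. As a minimal, hypothetical sketch (plain Python, all names invented for illustration, not from the posting): a load step that merges by business key, like SQL MERGE, so replaying the same batch leaves the target unchanged.

```python
# Hypothetical sketch of an idempotent load step: merging by business key
# (like SQL MERGE) means a rerun or backfill of the same batch yields the
# same target state -- no duplicate rows, which is what makes a DAG
# rerunnable without breaking downstream SLAs.

def upsert(target: dict, batch: list, key: str = "id") -> dict:
    """Insert or overwrite each row in target, keyed by a business key."""
    for row in batch:
        target[row[key]] = row
    return target

if __name__ == "__main__":
    store = {}
    batch = [{"id": 1, "status": "new"}, {"id": 2, "status": "new"}]
    upsert(store, batch)
    snapshot = dict(store)
    upsert(store, batch)      # rerun the same batch
    assert store == snapshot  # idempotent: state unchanged
```

An append-only load, by contrast, would double the row count on every retry; keying the write is what makes "rerunnable" safe.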
The Mission
* Architect & Build: Design scalable ingestion frameworks using Qlik, Glue, and ETLs.
* Model at Scale: Implement Raw → DV 2.0 (Hubs/Links/Sats) → Consumption patterns in dbt Cloud with obsessive testing (uniqueness, relationships, freshness).
* Snowflake Mastery: Build performant objects (tables, streams, tasks) and fine-tune clustering and micro-partitioning for peak efficiency.
* Orchestrate Excellence: Author Airflow (MWAA) DAGs and dbt Cloud jobs that are idempotent, rerunnable, and strictly tracked against SLAs.
* Secure the Perimeter: Enforce RBAC/ABAC, masking, and row-access policies. You'll operationalize controls that make auditors smile: change management, separation of duties, and evidence capture.
* Ops & Observability: Bake tests into dbt, monitor via ACCOUNT_USAGE, and forward metrics to Splunk/Datadog.
* FinOps: Right-size warehouses and manage multi-cluster concurrency to keep performance high and costs low.

What You Bring to the Table
The Basics
* Bachelor's Degree + 6 years of advanced data engineering/enterprise architecture experience.
* OR a High School Diploma/GED + 10 years of the same high-level experience.
Technical Must-Haves
* Snowflake Power User: Deep experience in secure account setup, storage integrations, Snowpipe, and cross-region replication. You understand the networking "under the hood" (AWS PrivateLink, VPC/DNS flows).
* dbt Cloud Specialist: You know Dimensional and Data Vault 2.0 modeling, Jinja/macros, and the discipline of a DEV/QA/UAT/PROD promotion flow.
* Airflow (MWAA) Expert: You've built modular DAGs, handled backfills, and know exactly when to use Airflow vs. dbt's native orchestration.
* The Compliance Mindset: You've worked in regulated environments (SOX, GLBA, FFIEC, or PCI) and understand runbooks, PIR/RCAs, and audit log immutability.
* Coding/Cloud: Advanced SQL, Python (ETL/Airflow), and AWS fundamentals (S3, IAM, CloudWatch).
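The dbt testing discipline the posting calls out (uniqueness, relationships, freshness) boils down to simple predicates over a table. A hypothetical plain-Python sketch of the three checks, with row data and column names invented for illustration:

```python
from datetime import datetime, timedelta, timezone

# Hypothetical stand-ins for dbt's schema tests, over rows modeled as dicts.

def is_unique(rows, col):
    """uniqueness: no duplicate values in the column."""
    vals = [r[col] for r in rows]
    return len(vals) == len(set(vals))

def relationships_hold(rows, col, parent_keys):
    """relationships: every foreign key resolves to a parent key."""
    return all(r[col] in parent_keys for r in rows)

def is_fresh(rows, col, max_age):
    """freshness: the newest load timestamp is recent enough."""
    newest = max(r[col] for r in rows)
    return datetime.now(timezone.utc) - newest <= max_age

if __name__ == "__main__":
    now = datetime.now(timezone.utc)
    orders = [
        {"order_id": 1, "customer_id": 10, "loaded_at": now},
        {"order_id": 2, "customer_id": 11, "loaded_at": now},
    ]
    customers = {10, 11}
    assert is_unique(orders, "order_id")
    assert relationships_hold(orders, "customer_id", customers)
    assert is_fresh(orders, "loaded_at", timedelta(hours=24))
```

In dbt Cloud these same predicates would be declared in YAML (`unique`, `relationships`, source `freshness`) and compiled to SQL; the sketch just shows what each test asserts.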
Bonus Points (The "Cherry on Top")
* Experience with Snowflake Governance (Universal Search, masking automation).
* Familiarity with Iceberg/External Tables or Kafka-driven ingestion.
* Observability tools like Great Expectations, Monte Carlo, or Collibra.
* Platform Engineering: Reusable Terraform modules, FinOps charge-back utilities, and service-account hardening.
* BI/Semantic Layer: Designing metric layers for ThoughtSpot, Looker, or Power BI.
Does your code survive audits? Do your pipelines never miss an SLA? Apply now and let's get to work.

Language skills

  • English
Notice to users

This offer comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.