Senior Data Engineer - Anveta - United States

This job posting is no longer available


Senior Data Engineer

Anveta
  • United States

About

Most Urgent Requirement
Local candidates only, as the client requires an in-person interview.
The role requires hands-on experience with dbt and Apache Airflow deployed on Kubernetes, specifically within an on-prem OpenShift environment. This position involves close interaction with infrastructure, including Kubernetes operations, Airflow design and implementation, and hands-on dbt model development in an on-prem setup. Given these requirements, we are looking for someone with deep, practical experience in dbt and Airflow within Kubernetes-based, on-prem environments.
Title: Senior Data Engineer - Airflow, dbt Core, Kubernetes/OpenShift
Pay: up to $70/hr, all-inclusive
Location: Onsite 3 days/week in Jersey City, NJ (185 Hudson St #1150, Jersey City, NJ 07311)
Start: ASAP
Interview Process: 2 rounds, 1 virtual and 1 in-person
*When submitting, please make sure the resume is 3 pages or less (this is required by BBH); also include the candidate's LinkedIn profile with picture and full name.*
Must have:
  • Python
  • Apache Airflow / dbt
  • Communication, both written & verbal
  • Kubernetes
  • OpenShift
  • 8+ years of experience
Project:
  • This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads.
  • The client is implementing a new accounting system and merging data warehouses, so they need a strong data engineer to design and engineer for optimization and to streamline process automation.
  • They are looking for someone to support the business and think strategically about improving what they have today, as well as do the hands-on administration work to drive the roadmap.
  • This person will likely partner with the BBH cloud engineering team that supports OpenShift for the firm, but that help is limited; Capital Partners (this division) has its own business requirements distinct from the rest of the firm.
Job Description
We are seeking a highly skilled Senior Data Engineer with 8+ years of hands-on experience in enterprise data engineering, including deep expertise in Apache Airflow DAG development, dbt Core modeling and implementation, and cloud-native container platforms (Kubernetes / OpenShift).
This role is critical to building, operating, and optimizing scalable data pipelines that support financial and accounting platforms, including enterprise system migrations and high-volume data processing workloads.
The ideal candidate will have extensive hands-on experience in workflow orchestration, data modeling, performance tuning, and distributed workload management in containerized environments.
Key Responsibilities:
Data Pipeline & Orchestration
  • Design, develop, and maintain complex Airflow DAGs for batch and event-driven data pipelines
  • Implement best practices for DAG performance, dependency management, retries, SLA monitoring, and alerting
  • Optimize Airflow scheduler, executor, and worker configurations for high-concurrency workloads
dbt Core & Data Modeling
  • Lead dbt Core implementation, including project structure, environments, and CI/CD integration
  • Design and maintain robust dbt models (staging, intermediate, marts) following analytics engineering best practices
  • Implement dbt tests, documentation, macros, and incremental models to ensure data quality and performance
  • Optimize dbt query performance for large-scale datasets and downstream reporting needs
Cloud, Kubernetes & OpenShift
  • Deploy and manage data workloads on Kubernetes / OpenShift platforms
  • Design strategies for workload distribution, horizontal scaling, and resource optimization
  • Configure CPU/memory requests and limits, autoscaling, and pod scheduling for data workloads
  • Troubleshoot container-level performance issues and resource contention
Performance & Reliability
  • Monitor and tune end-to-end pipeline performance across Airflow, dbt, and data platforms
  • Identify bottlenecks in query execution, orchestration, and infrastructure
  • Implement observability solutions (logs, metrics, alerts) for proactive issue detection
  • Ensure high availability, fault tolerance, and resiliency of data pipelines
Collaboration & Governance
  • Work closely with data architects, platform engineers, and business stakeholders
  • Support financial reporting, accounting, and regulatory data use cases
  • Enforce data engineering standards, security best practices, and governance policies
Required Skills & Qualifications:
Experience
  • 10+ years of professional experience in data engineering, analytics engineering, or platform engineering roles
  • Proven experience designing and supporting enterprise-scale data platforms in production environments
Must-Have Technical Skills
  • Expert-level Apache Airflow (DAG design, scheduling, performance tuning)
  • Expert-level dbt Core (data modeling, testing, macros, implementation)
  • Strong proficiency in Python for data engineering and automation
  • Deep understanding of Kubernetes and/or OpenShift in production environments
  • Extensive experience with distributed workload management and performance optimization
  • Strong SQL skills for complex transformations and analytics
Cloud & Platform Experience
  • Experience running data platforms on cloud environments
  • Familiarity with containerized deployments, CI/CD pipelines, and Git-based workflows
Preferred Qualifications
  • Experience supporting financial services or accounting platforms
  • Exposure to enterprise system migrations (e.g., legacy platform to modern data stack)
  • Experience with data warehouses (Oracle)
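The dbt responsibilities above center on incremental models for large datasets. A hedged sketch of what one might look like, using dbt Core's documented `config`, `is_incremental()`, `ref`, and `this` constructs (the model, source, and column names are hypothetical):

```sql
-- models/marts/fct_ledger_entries.sql (hypothetical model name)
{{ config(
    materialized='incremental',
    unique_key='entry_id'
) }}

select
    entry_id,
    account_id,
    amount,
    posted_at
from {{ ref('stg_ledger_entries') }}

{% if is_incremental() %}
  -- on incremental runs, only pick up rows newer than what is already loaded
  where posted_at > (select max(posted_at) from {{ this }})
{% endif %}
```

The `unique_key` lets dbt merge late-arriving corrections instead of appending duplicates, which matters for the accounting workloads this role supports.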
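For the CPU/memory requests and limits mentioned under Cloud, Kubernetes & OpenShift, a sketch of how a data-workload pod might declare its resources (pod name, image, and values are illustrative, not taken from the client's environment):

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: airflow-worker                # illustrative name
spec:
  containers:
    - name: worker
      image: registry.example.com/airflow-worker:latest   # placeholder image
      resources:
        requests:                     # what the scheduler reserves on a node
          cpu: "500m"
          memory: "2Gi"
        limits:                       # hard ceiling; exceeding the memory
          cpu: "2"                    # limit gets the container OOM-killed
          memory: "4Gi"
```

Tuning the gap between requests and limits is one of the workload-distribution and resource-contention tasks the role calls out.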
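To give candidates a concrete sense of the Airflow orchestration duties listed under Key Responsibilities, here is a minimal sketch of the retry and SLA settings a DAG's `default_args` might carry. Parameter names follow Airflow's documented API, but the owner name and values are illustrative; the DAG object itself is omitted so the snippet runs without an Airflow install.

```python
from datetime import timedelta

# default_args passed to airflow.DAG(...): every task in the DAG inherits
# these unless it overrides them. Keys follow Airflow's documented API.
default_args = {
    "owner": "data-engineering",          # illustrative team name
    "retries": 3,                         # re-run a failed task up to 3 times
    "retry_delay": timedelta(minutes=5),  # wait between attempts
    "retry_exponential_backoff": True,    # roughly double the wait each retry
    "sla": timedelta(hours=2),            # flag tasks that run past 2 hours
    "depends_on_past": False,             # runs are independent across dates
}

def backoff_schedule(base: timedelta, retries: int) -> list[timedelta]:
    """Approximate delays under exponential backoff (Airflow also applies
    jitter and a max_retry_delay cap, both omitted here for clarity)."""
    return [base * (2 ** attempt) for attempt in range(retries)]
```

With the settings above, `backoff_schedule(timedelta(minutes=5), 3)` yields delays of 5, 10, and 20 minutes across the three retries.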

Language skills

  • English
Note for users

This job posting was published by one of our partners. You can view the original posting here.