
Data Engineer

Quantix, Inc.
  • United States
Apply Now

About

Title:
Senior to Expert-level Data Engineer

Location:
100% Remote

Time Zone:
CST

Hours:
Approximately 50 hours/week

Type:
Direct hire

Salary range:
$180K – $220K

Job Description:
We are looking for a senior to expert-level Data Engineer who will play a key role in accelerating innovation by delivering robust data solutions. You will be responsible for designing and implementing a scalable GCP-native data strategy that underpins our machine learning initiatives and enables decentralized, squad-owned data infrastructure. Working closely with industry-leading engineers and scientists, you will help achieve large-scale behavior change through intelligent, data-powered systems. Your work will involve crafting reusable, high-fidelity data products and building infrastructure that allows for rapid iteration, continual improvement, and measurable outcomes.

Requirements:

  • 8+ years of professional experience in data engineering or a related field, with demonstrated expertise in building and deploying scalable data solutions
  • Strong skills in critical thinking, decision making, problem-solving, and attention to detail
  • Technically skilled and able to understand the technology tradeoffs that your squad will face
  • Proficient at resolving ambiguity: when there is uncertainty, you can work with squad colleagues to define the path forward
  • Able to work independently, operating without significant input or guidance
  • Expert-level proficiency in Python and SQL for scalable data transformation and strategic analysis within the squad's domain
  • Expertise in designing, building, and operating data products (not just pipelines) that deliver compounding value and adhere to domain-driven data principles
  • Ability to architect and govern the data storage strategy within the squad, strategically utilizing transactional systems (e.g., AlloyDB), Operational Data Stores (ODS), and analytical data warehouses, with a primary focus on BigQuery
  • Mastery of Google BigQuery for strategic data product development and high-volume analytical processing, coupled with deep hands-on experience integrating with transactional databases such as AlloyDB (PostgreSQL) and Cloud SQL (PostgreSQL)
  • Experience modernizing legacy data assets and optimizing high-performance SQL/procedural logic, including exposure to proprietary SQL dialects (e.g., T-SQL, PL/pgSQL), with a demonstrated ability to migrate logic from operational databases to BigQuery/Dataform
  • Extensive experience architecting ingestion strategies using native services like Pub/Sub and Datastream (CDC) for high-throughput data delivery
  • Deep expertise (mandatory) in the native GCP data ecosystem, including Dataform for transformations, Cloud Composer (Airflow) for orchestration, and Cloud Dataflow (Apache Beam) for processing
  • Experience applying Dataplex features for data governance, quality, and discovery across the domain's data products
  • Proven success collaborating across engineering, product, and science teams to deliver squad-owned data products in a fast-paced, iterative environment
  • Highly motivated and organized, demonstrating an advanced ability to influence technical direction and build effective partnerships across internal and external stakeholders

Language Skills

  • English
Notice to Users

This offer comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.