
Data Engineer

ATC
  • New York, New York, United States

About the Company

We are building a modern, event-driven data and logging platform to support product analytics, operational insights, customer-facing activity logs, and internal observability.

About the Role

We're looking for a Data Engineer who can design, build, and scale our pipelines across Google Cloud Platform, BigQuery, and Grafana/Loki-based systems. You will own the ingestion, modeling, transformation, and optimization of our Activity Logs, Email Logs, Integration Sync Logs, Application Logs, and operational analytics datasets. This role is highly cross-functional, working closely with backend engineers, DevOps/SRE, product, and analytics teams.

Responsibilities

  Build & Maintain Logging & Event Pipelines
    • Design and maintain streaming pipelines using GCP Pub/Sub, Log Router, Cloud Logging, and Cloud Run / Dataflow.
    • Build structured ingestion pipelines for Activity Logs, Email Logs, Integration Sync Logs, and Application Logs.
    • Implement schema versioning, validation, and data quality enforcement.

  Design Data Models for Logs & Analytics
    • Create partitioned and clustered BigQuery schemas optimized for customer-facing queries and internal analytics.
    • Convert raw Cloud Logging entries into structured, flattened analytic tables.

  Optimize Cost, Performance & Reliability
    • Enforce BigQuery best practices (partitioning, clustering, TTL, materialized views).
    • Build monitoring for ingestion lag, schema drift, and cost anomalies.

  Enable Dashboards & Insights
    • Prepare datasets for Looker Studio, Grafana, and other BI tools.
    • Build ELT pipelines for metrics such as API hit counts, latency, error rates, and sync job summaries.

  Collaborate with DevOps / SRE / Backend Teams
    • Work with DevOps to manage IAM, Pub/Sub, BigQuery infrastructure, and the observability stack.
    • Work with backend engineers to expose logs in the SaaS product UI.
    • Support SRE in debugging production performance using structured logs.
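To illustrate the kind of transformation the role involves, flattening a nested Cloud Logging entry into a single-level row for a flat analytic table might look like the following minimal Python sketch (the entry structure and field names here are illustrative assumptions, not taken from the posting):

```python
def flatten(entry: dict, prefix: str = "") -> dict:
    """Recursively flatten a nested log entry into a single-level dict
    whose keys are dot-separated paths (e.g. "resource.labels.service_name"),
    suitable for loading into a flat analytic table."""
    flat = {}
    for key, value in entry.items():
        path = f"{prefix}.{key}" if prefix else key
        if isinstance(value, dict):
            flat.update(flatten(value, path))
        else:
            flat[path] = value
    return flat

# A raw Cloud-Logging-style entry (illustrative structure):
raw = {
    "timestamp": "2024-05-01T12:00:00Z",
    "severity": "ERROR",
    "resource": {
        "type": "cloud_run_revision",
        "labels": {"service_name": "sync-worker"},
    },
    "jsonPayload": {"event": "integration_sync_failed", "retry": 3},
}

row = flatten(raw)
# row["resource.labels.service_name"] == "sync-worker"
```

In practice a pipeline stage like this would run per message (e.g. in a Dataflow or Cloud Run worker) before loading rows into BigQuery.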

Qualifications

  • 3–7+ years of experience as a Data Engineer.
  • Strong experience with GCP Data Stack: BigQuery, Pub/Sub, Cloud Logging.
  • Excellent SQL skills.
  • Experience with streaming pipelines.
  • Python for pipeline development.
  • Experience with BigQuery optimization and data modeling.

Required Skills

  • Experience with Grafana Loki, Grafana Agent/Alloy, Fluent Bit.
  • Experience with ClickHouse, OpenSearch, or Elastic.
  • Familiarity with Looker Studio or other BI tools.
  • Experience in SaaS multi-tenant environments.

Equal Opportunity Statement

We are committed to diversity and inclusion.

Languages

  • English

Notice for Users

This job comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.