Data Engineer | UpRecruit | Los Angeles, California, United States

This job posting is no longer available.


Data Engineer

UpRecruit
  • Los Angeles, California, United States

About

About Our Client

Our client is a B2B SaaS platform transforming strategy and performance with AI-driven workflows and unified data insights. They're seeking a Data Pipeline Engineer to build, optimize, and scale the data ingestion and transformation layer that powers their flagship product.

The Role

Our client is seeking a Data Pipeline Engineer to own and enhance the core data infrastructure that powers their product ecosystem. This role is highly impactful: you will maintain ingestion pipelines, troubleshoot complex integration issues, optimize SQL workflows, and build reliable connections between internal systems and key third-party SaaS platforms.

This position is ideal for a data engineer who loves solving messy data problems, improving reliability, and building clean, scalable data models. You'll work closely with product leadership, engineering, and external tools to ensure the data foundation is robust and ready for high-volume growth.

What You'll Do

  • Maintain and improve data ingestion pipelines, including integrations built with Hotglue and Heroku
  • Troubleshoot and resolve schema mismatches, API limits, authentication errors, and connection issues
  • Build and optimize SQL-based ETL/ELT workflows, transformations, and views in PostgreSQL
  • Manage staging datasets, including anonymization and synthetic data generation
  • Define and implement core customer-facing metrics in partnership with product leadership
  • Develop and maintain third-party SaaS integrations (HubSpot, QuickBooks, Asana, etc.)
  • Support lightweight DevOps tasks including CI/CD workflows, performance tuning, and monitoring
  • Ensure reliability and scalability across a multi-tenant SaaS data architecture
  • Drive best practices for data quality, versioning, and pipeline observability

What We're Looking For

  • 3–6+ years of experience in data engineering, integrations, or ETL-focused roles
  • Deep SQL and PostgreSQL expertise (schema design, optimization, performance tuning)
  • Experience with ETL tools such as Hotglue, dbt, Airflow, Fivetran, or similar
  • Strong understanding of REST APIs, OAuth authentication, rate limiting, and webhook-driven integrations
  • Familiarity with Git, GitHub Actions, and modern CI/CD workflows
  • Experience working with SaaS data models and multi-tenant architectures
  • Strong problem-solving skills and a collaborative, product-oriented mindset

Bonus Experience

  • Knowledge of SOC2/GDPR compliance or secure data-handling practices
  • Experience generating synthetic datasets or anonymizing production data
  • Exposure to LLM/AI-powered workflows or data enrichment processes

Compensation

$150,000 + benefits

Full-time | Remote | No C2C


Languages

  • English
Note for Users

This job posting was published by one of our partners. You can view the original posting here.