GCP Certified Data Engineer | CloudIngest | Atlanta, Georgia, United States

This job offer is no longer available


GCP Certified Data Engineer

CloudIngest
  • Atlanta, Georgia, United States (US)

About

GCP Data Engineer

Location: Atlanta, GA (On-site/Hybrid as applicable)

Summary

We are seeking a highly skilled GCP Data Engineer to design, build, and optimize cloud-native data pipelines and analytics solutions on Google Cloud Platform. The ideal candidate has strong experience with Python, BigQuery, Cloud Data Fusion, and core GCP services such as Cloud Composer, Cloud Storage, Cloud Functions, and Pub/Sub. This role requires a strong foundation in data warehousing concepts and scalable data engineering practices.

Responsibilities

  • Design, develop, and maintain robust ETL/ELT pipelines on Google Cloud Platform.
  • Build and optimize data workflows using Cloud Data Fusion, BigQuery, and Cloud Composer.
  • Write efficient and maintainable Python code to support data ingestion, transformation, and automation.
  • Develop optimized BigQuery SQL for analytics, reporting, and large-scale data modeling.
  • Use GCP services such as Cloud Storage, Pub/Sub, and Cloud Functions to build event-driven, scalable data solutions.
  • Ensure data quality, governance, and reliability across all pipelines.
  • Collaborate with cross-functional teams to deliver clean, trusted, production-ready datasets.
  • Monitor, troubleshoot, and resolve performance issues in cloud data pipelines and workflows.

Must-Have Skills

  • Strong experience with GCP BigQuery (data modeling, SQL development, performance tuning).
  • Proficiency in Python for data engineering and pipeline automation.
  • Hands-on experience with Cloud Data Fusion for ETL/ELT development.
  • Working experience with key GCP services: Cloud Composer, Cloud Storage, Cloud Functions, and Pub/Sub.
  • Strong understanding of data warehousing concepts, star/snowflake schemas, and best practices.
  • Solid understanding of cloud data architecture and distributed processing.

Good-to-Have Skills

  • Experience with Vertex AI for ML pipeline integration or model deployment.
  • Familiarity with Dataproc (Spark/Hadoop) for large-scale processing.
  • Knowledge of CI/CD workflows, Git, and DevOps best practices.
  • Experience with Cloud Logging and Cloud Monitoring tools.

Languages

  • English