
GCP Big Data Engineer

Kastech
  • US
    Phoenix, Arizona, United States

About

Role: GCP Big Data Engineer
Location: Phoenix, AZ – Hybrid/Onsite
Duration: Long-term Contract

Job Description

We are looking for a skilled GCP Big Data Engineer with 14+ years of experience to design, build, and maintain scalable data pipelines and big data solutions on Google Cloud.

Responsibilities

  • Develop ETL/ELT pipelines using BigQuery, Cloud Dataflow, and Dataproc
  • Process large datasets using Apache Spark and SQL
  • Optimize data workflows, performance, and cloud costs
  • Collaborate with cross-functional teams on analytics and reporting solutions
  • Ensure data quality, security, and reliability

Required Skills

  • Strong experience with Google Cloud
  • Proficiency in Python, SQL, and Spark
  • Hands-on experience with BigQuery, Dataflow, and Dataproc
  • Knowledge of ETL, data warehousing, and cloud architectures
  • Familiarity with CI/CD and DevOps tools is a plus

Qualifications

  • Bachelor's degree in computer science or a related field
  • GCP certifications

Experience

  • 12+ years of experience in Data Engineering or Big Data technologies

Languages

  • English