
Data Engineer - GCP

Euclid Innovations
  • US
    Charlotte, North Carolina, United States

About

Key Responsibilities

  • Design and build scalable ETL/data pipelines using Spark and Python
  • Develop data workflows to ingest, transform, and move large datasets
  • Implement data routing logic to direct data to GCP (BigQuery, Dataflow, Dataproc) and on-prem platforms (DPC)
  • Ensure data quality, validation, and reconciliation across systems
  • Collaborate with data science and platform teams to support predictive model pipelines
  • Optimize performance and scalability for high-volume data processing

Required Skills

  • Strong hands-on experience with Apache Spark / PySpark for large-scale data processing
  • Proficiency in Python for data engineering (ETL pipelines)
  • Experience designing and developing data pipelines / data engineering workflows
  • Solid background in ETL, data ingestion, transformation, and data movement
  • Experience working with big data technologies and handling large datasets (batch/streaming)
  • Experience with cloud platforms – GCP (Google Cloud Platform): BigQuery, Dataflow, Dataproc, GCS (Google Cloud Storage)
  • Experience with data migration / data integration projects
  • Understanding of data pipeline architecture and distributed systems

Languages

  • English
Notice for Users

This job comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.