


GCP Data Engineer with Python

Saransh
  • United States

About

Role: GCP Data Engineer with Python
Location: Dearborn, MI (4 days a week onsite)
Job Type: Contract
Experience: 8 to 12 years overall

Job Summary
The Data Engineer will support the Credit Global Securitization (GS) team's upskilling initiative by contributing to data engineering efforts across cloud and traditional platforms. This role is intended to accelerate development and delivery. The engineer will work closely with cross-functional teams to build, optimize, and maintain data pipelines and workflows using GCP, Python, and ETL tools.
Required Technical Skills
  • Minimum 3 years of hands-on experience with Google Cloud Platform (GCP), specifically using Astronomer / Composer for orchestration.
  • Strong proficiency in Python for data engineering and automation.
  • Experience with RDBMS technologies such as DB2 and Teradata.
  • Exposure to Big Data ecosystems and distributed data processing.
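For candidates unfamiliar with Composer-style orchestration, the sketch below shows roughly what this day-to-day work looks like: a minimal Airflow 2.x DAG (the engine behind both Cloud Composer and Astronomer) with a single Python task. The DAG id, schedule, and task body are hypothetical placeholders, not details from this posting.

```python
# A minimal sketch of a Composer / Astronomer-style pipeline, assuming
# Airflow 2.4+. All names and logic here are illustrative placeholders.
from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def transform(**context):
    # Placeholder transformation step; real logic would read from
    # DB2/Teradata and write curated output to a GCP sink.
    print("running transform for", context["ds"])


with DAG(
    dag_id="gs_daily_pipeline",  # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",
    catchup=False,
) as dag:
    PythonOperator(task_id="transform", python_callable=transform)
```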
Nice-to-Have Technical Skills
  • Prior experience with ETL tools like DataStage or Informatica.
Responsibilities
The Data Engineer will play a key role in developing and maintaining scalable data pipelines and workflows. The engineer will work with GCP tools like Astronomer / Composer and leverage Python for automation and transformation tasks. The role involves integrating data from RDBMS platforms such as DB2 and Teradata, and supporting ETL processes using tools like DataStage or Informatica. The engineer will collaborate with existing team members, including Software Analysts and Scrum Masters, and will be expected to contribute to knowledge sharing and process improvement.
Specifically
  • Develop and implement solutions using GCP, Python, and Big Data technologies to enhance data analysis capabilities.
  • Collaborate with cross-functional teams to design and optimize data models in Teradata and DB2 environments.
  • Use Python scripting and automation to streamline geospatial data processing tasks.
  • Integrate and manage data workflows using Cloud Composer to ensure efficient data pipeline operations.
  • Leverage GCP to deploy scalable applications and services.
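As an illustration of the RDBMS-to-GCP integration work described above, here is a minimal, hypothetical Python sketch that pulls rows from Teradata and streams them into BigQuery. The host, credentials, table names, and SQL are placeholders; a production pipeline would add batching, schema handling, and retries.

```python
# Hypothetical sketch only: connection details, table names, and SQL
# are placeholders, not details from this posting.
import teradatasql
from google.cloud import bigquery


def teradata_to_bigquery():
    bq = bigquery.Client()  # uses application-default credentials
    con = teradatasql.connect(host="td-host", user="svc_user", password="***")
    try:
        cur = con.cursor()
        cur.execute("SELECT deal_id, balance, as_of_date FROM gs_db.positions")
        # Convert each row to a JSON-serializable dict for the streaming API.
        rows = [
            {"deal_id": r[0], "balance": float(r[1]), "as_of_date": str(r[2])}
            for r in cur.fetchall()
        ]
    finally:
        con.close()
    errors = bq.insert_rows_json("my-project.credit_gs.positions", rows)
    if errors:
        raise RuntimeError(f"BigQuery streaming insert failed: {errors}")
```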
Key Skills
Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Employment Type: Full Time
Vacancy: 1

Languages

  • English