About
Required Technical Skills
- Minimum 3 years of hands-on experience with Google Cloud Platform (GCP), specifically using Astronomer / Composer for orchestration.
- Strong proficiency in Python for data engineering and automation.
- Experience with RDBMS technologies such as DB2 and Teradata.
- Exposure to Big Data ecosystems and distributed data processing.
Nice to have Technical Skills
- Prior experience with ETL tools like DataStage or Informatica.
Responsibilities
The Data Engineer will play a key role in developing and maintaining scalable data pipelines and workflows, working with GCP tools such as Astronomer / Composer and leveraging Python for automation and transformation tasks. The role involves integrating data from RDBMS platforms such as DB2 and Teradata, and supporting ETL processes using tools like DataStage or Informatica. The engineer will collaborate with existing team members, including Software Analysts and Scrum Masters, and is expected to contribute to knowledge sharing and process improvement.
Specifically
- Develop and implement solutions using GCP, Python, and Big Data technologies to enhance data analysis capabilities.
- Collaborate with cross-functional teams to design and optimize data models in Teradata and DB2 environments.
- Use Python scripting and automation to streamline geospatial data processing tasks.
- Integrate and manage data workflows using Cloud Composer to ensure efficient data pipeline operations.
- Leverage GCP to deploy scalable applications and services.
Key Skills
Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala
Employment Type: Full Time
Vacancy: 1
Languages
- English