About
Required Qualifications
- Bachelor's degree in Computer Engineering or a related field.
- 5+ years of experience as a Data Engineer on GCP, including Python, Java, Spark, and SQL.
- Expertise in Google's Identity and Access Management (IAM) API.
- Proficiency in Linux/Unix with strong scripting skills (Shell, Bash).
- Experience with big data technologies such as HDFS, Spark, Impala, and Hive.
- Familiarity with version control platforms like GitHub and CI/CD tools such as Jenkins and Terraform.
- Proficiency with Airflow for workflow orchestration.
- Strong knowledge of GCP platform tools, including Pub/Sub, Cloud Storage, Bigtable, BigQuery, Dataflow, Dataproc, and Composer.
- Experience with web services and APIs (RESTful and SOAP).
- Hands-on experience with real-time streaming and batch processing tools like Kafka, Flume, Pub/Sub, and Spark.
- Ability to work with different file formats such as Avro, Parquet, and JSON.
- Expertise in pipeline creation, automation for data acquisition, and metadata extraction.
Preferred Qualifications
- Coding skills in Scala.
- Knowledge of Apache packages and hybrid cloud architectures.
- Strong experience in API orchestration and choreography for consumer apps.
- Proven ability to collaborate with scrum teams and contribute to Agile processes using Jira and Confluence.
- Familiarity with Hadoop ecosystems and cloud platforms.
- Experience in managing and scheduling batch jobs and data quality control metrics.
Language Skills
- English
Notice to Users
This listing comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.