About
Skills
- Google Cloud Platform (GCP) tools (preferred): BigQuery, Cloud Storage, Dataflow, Cloud Functions, Pub/Sub, Cloud Run, Cloud Composer (Airflow), Cloud Spanner, Bigtable.
- Container orchestration: Kubernetes (preferably on GKE) and Helm for managing and deploying containerized applications.
- CI/CD and automation: Jenkins for building CI/CD pipelines that automate deployment and testing of data pipelines.
- Programming languages: proficient in Python for data processing and automation, and in SQL for querying and data manipulation. Experience with Java is a plus.
- DevOps tools: familiarity with Terraform or Deployment Manager for Infrastructure as Code (IaC) to manage GCP resources.
- Monitoring and logging: experience with Cloud Monitoring, Datadog, or other monitoring solutions to track pipeline performance and ensure operational efficiency.
Data Engineering Skills
- Expertise in ETL/ELT pipelines, data modeling, and data integration across large datasets.
- Strong understanding of data warehousing and real-time data processing workflows.
- Strong communication skills to work effectively with cross-functional teams and mentor junior developers.
- Proven ability to lead in an Agile environment.
- 3+ years of experience as a data engineer, with hands-on experience in Kubernetes, Helm, Python, and Jenkins.
- Strong experience building and optimizing data pipelines and services on any cloud platform.
- Proficiency in Python and SQL. Familiarity with Java and Docker is a plus.
Nice-to-have skills
- Google Cloud Platform
Work experience
- Data Engineer
- Data Infrastructure
Languages
- English