About
- Perform requirements gathering with business users and SMEs and build a plan
- Perform ETL and data engineering work leveraging Google Cloud components such as Cloud Dataflow, Cloud Dataproc, and Google BigQuery
- Knowledge of data modelling and reporting using Google Cloud BigQuery
- Knowledge of building data pipelines following GCP best practices
- Strong understanding of Kubernetes and Docker containers and of deploying GCP services
- Experience writing code to extract and transform data from multiple data sources
- Experience with ETL tools such as Informatica or similar
- Experience with scheduling tools such as Airflow, Cloud Composer, etc.
- Experience with CI/CD automation pipelines facilitating automated deployment and testing
- Excellent verbal and written communication, problem-solving, and interpersonal skills
- Experience in data catalog and metadata management
- Experience with JIRA or other project management tools
- Deliver end-to-end comprehensive documentation along with code samples
- Experience in one or more scripting languages / cloud solutions is a plus

Key Accountabilities and Priorities:
- Build and operationalize pipelines covering data acquisition, staging, integration of new data sources, cataloging, cleansing, batch and stream processing, transformation, and consumption
- Independently work on assigned projects and foster a collaborative environment for a high-performing team
- Gather business requirements, review business priorities, and analyze options and risks
- Quickly understand and formulate application-level requirements pertaining to complex workflows or integration with external applications

Additional Information:
Bachelor's degree required; Master's degree is a plus. 8+ years' experience in Information Technology and/or IT Professional Services.
Languages
- English