- Texas, United States
About
Greetings from Abode Techzone, LLC!
We are looking for the best match for one of our client's urgent requirements, detailed below.
Let me know if you would be interested in moving ahead. If yes, please share your UPDATED RESUME along with your Hourly Rate expectations and the details below:
Work Authorization:
Hourly Rate expectations:
Current location:
When will your current project be over: mention date / already over
Relocation: Yes / No
Job Description:
Job Title:
GCP IBM Streams Data Engineer
Location: Irving, TX | Basking Ridge, NJ | Temple Terrace, FL | Alpharetta, GA | Piscataway, NJ | Colorado Springs, CO | Ashburn, VA | Lone Tree, CO
Duration: 12-month contract
No. of new vacancies to be filled: 2
Interview: Three rounds of video interviews
Rate: Hourly rate is based on your experience, skillset, telecom domain background, and work on large, complex projects.
Skillset: IBM Streams, Apache Flink, Google Cloud Platform, Linux, Apache Airflow, Google Compute Engine, GitLab, BigQuery, Python, PySpark, Spark & Kafka
Job Description:
Must have 5+ years of experience as a GCP Data Engineer.
The Data Engineer will be responsible for developing and supporting database applications to drive automated data collection, storage, visualization and transformation as per business needs.
Understand the existing IBM Streams pipelines
Able to write custom templates in Apache Dataflow (Apache Beam)
Migrate the IBM Streams pipelines to Dataflow
Able to handle billions of records per day in real-time streaming systems and write pipelines in Flink / Dataflow on GCP
Able to perform quality checks on the data (data validation)
Work experience with CI/CD tools such as Jenkins, and setting them up in the cloud
Database design, data modelling, and mining
Consolidate data across multiple sources and databases to make it easier to locate and access.
Implement automated data collection and data storage systems.
Write complex SQL queries and stored procedures.
Work with multiple data systems and large relational databases.
Ideal skills
- Apache Flink
- Google Cloud Platform
- Linux
- GitLab
- Python
- PySpark
- Spark
- Kafka
- SQL
- Jenkins
Professional experience
- Data Engineer
- Data Infrastructure
Language skills
- English