This job posting is no longer available
About
W2 Candidates Only — No C2C, No 3rd-Party Agencies Please
Summary
Build a comprehensive QA framework in Azure Databricks that gathers, stores, and processes large volumes of data for QA analysis and for detecting bugs in ETL code.
Role Overview
The Security Data Operations team is seeking a QA Data Engineer to design and develop a QA framework and pipeline for the Data Lakehouse. This role is a necessary component of the overall SDLC effort to centralize Information Security data and to verify the accuracy of data storage and transformation as the ETL process moves data through the medallion architecture.
Responsibilities
Develop QA ETL pipelines, ensure data quality, and optimize data storage solutions.
Analyze and develop the ingestion pipeline to test data feeds across raw, transformation, and target process stages.
Function as a development resource in the Information Security Data Operations team and serve as a technical liaison between IT and Data Engineering teams.
Create a mechanism to generate JIRA tickets so developers can remediate code bugs as they arise.
Build QA processes that are flexible, adaptable, and scalable.
Work with QA analysts to test code and data as part of the SDLC process.
Provide timely follow-ups and routine updates to the sprint schedule in adherence with internal IT SDLC agile processes.
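The responsibilities above (testing data feeds across raw, transformation, and target stages, then raising JIRA tickets for defects) can be sketched in miniature. This is a hypothetical illustration, not the team's actual framework: the helper `qa_row_parity` and the list-of-dicts tables are stand-ins for what would be Spark DataFrames in Azure Databricks, and each returned finding is the kind of record that could be forwarded to the JIRA REST API as a ticket.

```python
# Minimal sketch of one medallion-stage QA check (hypothetical helper names;
# in Databricks the inputs would be Spark DataFrames, not lists of dicts).

def qa_row_parity(raw_rows, target_rows, key="id"):
    """Compare a raw (bronze) feed against its transformed (silver/gold) target.

    Returns a list of findings; each finding carries enough detail to be
    turned into a JIRA ticket for the developer who owns the ETL code.
    """
    raw_keys = {row[key] for row in raw_rows}
    target_keys = {row[key] for row in target_rows}
    findings = []

    # Rows present in the raw stage but missing after transformation.
    missing = raw_keys - target_keys
    if missing:
        findings.append({
            "check": "row_parity",
            "severity": "high",
            "detail": f"{len(missing)} row(s) dropped in transformation: {sorted(missing)}",
        })

    # Rows that appear in the target without a raw-stage counterpart.
    extra = target_keys - raw_keys
    if extra:
        findings.append({
            "check": "row_parity",
            "severity": "medium",
            "detail": f"{len(extra)} unexpected row(s) in target: {sorted(extra)}",
        })
    return findings


# Example usage: one record is lost between bronze and silver.
raw = [{"id": 1}, {"id": 2}, {"id": 3}]
target = [{"id": 1}, {"id": 3}]
for finding in qa_row_parity(raw, target):
    print(finding)
```

A scalable version of this check would run as a scheduled Databricks job per feed, with each finding deduplicated before ticket creation so repeated runs do not flood the sprint backlog.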
Required Skills
Python
SQL
Azure Databricks
Azure Data Factory
AWS Glue
ETL / ELT
Data Lakehouse
Medallion Architecture
JIRA
Agile / SDLC
QA Frameworks
Cloud Data Tools
Language Requirements
- English
Note for Users
This job posting was published by one of our partners. You can view the original posting on the partner's site.