About
Technical/Business Skills:
- Hands-on experience building robust, metadata-driven, automated data pipeline solutions using modern cloud-based data technologies and tools for large data platforms.
- Hands-on experience applying data security and governance methodologies to meet data compliance requirements.
- Experience building automated ELT data pipelines and Snowpipe frameworks using Qlik Replicate, dbt Cloud, and Snowflake, with CI/CD.
- Hands-on experience building data integrity solutions across multiple data sources and targets, such as SQL Server, Oracle, Mainframe DB2, flat files, and Snowflake.
- Experience working with structured and semi-structured data files: CSV, fixed-width, JSON, XML, Excel, and mainframe VSAM.
- Experience using AWS services such as S3, Lambda, SQS, SNS, Glue, and RDS.
- Proficiency in Python, PySpark, and advanced SQL for ingestion frameworks and automation.
- Hands-on data orchestration experience with dbt Cloud and Astronomer Airflow.
- Experience implementing logging, monitoring, alerting, observability, and performance tuning techniques.
- Experience implementing and maintaining sensitive data protection strategies: tokenization, Snowflake data masking policies, dynamic and conditional masking, and role-based masking rules.
- Strong experience designing and implementing RBAC and data access controls, and adopting governance standards across Snowflake and supporting systems.
- Strong experience adopting release management guidelines, deploying code to multiple environments, implementing disaster recovery strategies, and leading production activities.
- Experience implementing schema drift detection and schema evolution patterns.
- Must have one or more certifications in the relevant technology fields.
- Nice to have: financial/banking experience.
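One of the requirements above, schema drift detection, can be sketched minimally in Python: compare an incoming dataset's column/type mapping against a registered expected schema and report added, removed, and retyped columns. The column names, types, and the `EXPECTED_SCHEMA` registry here are hypothetical, illustrative assumptions, not part of any specific pipeline framework.

```python
# Minimal schema drift detection sketch (illustrative; names are hypothetical).
# Schemas are modeled as {column_name: type_name} mappings.

EXPECTED_SCHEMA = {"id": "int", "name": "str", "created_at": "date"}

def detect_schema_drift(expected, incoming):
    """Return added, removed, and retyped columns between two schemas."""
    added = sorted(set(incoming) - set(expected))      # new columns in source
    removed = sorted(set(expected) - set(incoming))    # columns that vanished
    retyped = sorted(                                  # columns whose type changed
        col for col in set(expected) & set(incoming)
        if expected[col] != incoming[col]
    )
    return {"added": added, "removed": removed, "retyped": retyped}

# Example: the source system added an "email" column and started sending
# "created_at" as a string instead of a date.
incoming = {"id": "int", "name": "str", "created_at": "str", "email": "str"}
drift = detect_schema_drift(EXPECTED_SCHEMA, incoming)
# drift == {"added": ["email"], "removed": [], "retyped": ["created_at"]}
```

In a real pipeline, the drift report would typically drive an alert or an automated schema-evolution step (e.g. an `ALTER TABLE ... ADD COLUMN`) rather than a hard failure.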
Languages
- English