About
- Lead solution design, data analysis, and implementation of scalable data infrastructure using Snowflake, AWS, and Python.
- Develop ELT/ETL pipelines to support data movement and transformation across platforms.
- Collaborate with CRM Sales and cross-functional teams to integrate client hierarchies, territory logic, and segmentation rules.
- Support testing, production rollout, and quality execution of data initiatives.
- Optimize SQL queries and data ingestion performance.
- Contribute to DevOps practices, including CI/CD pipeline setup and automation.
- Participate in Agile ceremonies and promote cloud-first methodologies.
- Mentor team members and foster a culture of continuous improvement and innovation.
Required Qualifications
- Bachelor's or Master's degree in Computer Science, Engineering, or a related field.
- 10+ years of experience in data engineering or related roles.
- 6+ years of experience in data warehousing and data mart implementations.
- 4+ years of experience developing ELT/ETL pipelines for Snowflake.
- 4+ years of experience with AWS services (EC2, IAM, S3, EKS, KMS, CloudWatch, CloudFormation).
- Strong programming skills in Python or Java.
- Proficiency in SQL and experience with query optimization.
- Experience with job scheduling tools (Control-M preferred).
- Strong data modeling skills (Dimensional or Data Vault).
- Experience with container technologies (Docker, Kubernetes).
- Familiarity with DevOps tools (Maven, Jenkins, Stash, Ansible).
- Experience working in Agile environments (Kanban, Scrum).
- Excellent interpersonal, analytical, and problem-solving skills.
Preferred Qualifications
- Advanced SQL/SnowSQL knowledge.
- Experience in CRM data integration and territory management.
- Background in financial services or enterprise data platforms.
Languages
- English