About
Our client seeks a Senior Data Engineer to design, build, and maintain operational and analytical capabilities across modern data platforms. The role focuses on Snowflake, AWS, and Python to enable scalable data lakes and warehousing. You will drive solution design, data analysis, production rollout, and support while shaping a growing data infrastructure.

Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a benefits package that includes medical, dental, and vision coverage, a 401k with company matching, and life insurance.

Rate: $55.00 to $75.00/hr W2

Responsibilities
- Design and implement scalable data solutions on Snowflake and AWS for data lake and warehouse workloads.
- Build and maintain ELT/ETL pipelines to move and transform data to and from Snowflake.
- Perform data analysis, modeling, and profiling to support analytics and operational use cases.
- Optimize SQL and Snowflake performance, including query tuning and cost efficiency.
- Develop automation and CI/CD pipelines to enable reliable deployments and operations.
- Leverage AWS services such as EC2, IAM, S3, EKS, KMS, CloudWatch, and CloudFormation to operate data platforms.
- Use Python or Java for data engineering, orchestration, and utility development.
- Implement scheduling and orchestration using enterprise tools.
- Apply container technologies such as Docker and Kubernetes for packaging and runtime.
- Collaborate in Agile teams to improve efficiency and deliver high-quality data products.

Experience Requirements
- 10+ years of experience with a Bachelor’s or Master’s degree in a technology-related field.
- 6+ years with data warehousing and data mart concepts and implementations.
- 4+ years building ELT/ETL pipelines with Snowflake.
- 4+ years using AWS services, including EC2, IAM, S3, EKS, KMS, CloudWatch, and CloudFormation.
- 1+ year of object-oriented programming in Python or Java.
- Advanced SQL or SnowSQL knowledge.
- Hands-on SQL query optimization and performance tuning.
- Experience with job scheduling tools such as Control-M.
- Proven data analysis and data modeling skills, including Dimensional or Data Vault modeling.
- Experience with Docker and Kubernetes.
- Experience with DevOps, CI/CD, and related tooling such as Maven, Jenkins, Stash, Ansible, and Docker.
- Experience with Agile methodologies such as Kanban or Scrum (preferred).
- Ability to handle ambiguity and work in a fast-paced environment.
- Effective interpersonal skills to collaborate with multiple teams.
- Strong Snowflake and AWS expertise, including S3, Lambda, and CloudFormation.
- Python scripting experience supporting data engineering workflows.
- Leadership experience (preferred).

Education Requirements
- Bachelor’s or Master’s degree in a technology-related field such as Engineering or Computer Science.
- AWS-related certifications (preferred).
Language Skills
- English
Notice for Users
This job listing comes from a TieTalent partner platform. Click “Jetzt Bewerben” (“Apply Now”) to submit your application directly on their website.