About
Hybrid: 3-4 days onsite in either Burbank, CA or Orlando, FL.

Our client seeks senior data engineers to build and refactor data pipelines supporting enterprise data collection, transformation, and delivery. The role centers on Snowflake and Snowpark, using advanced Python and SQL to implement performant, secure solutions from defined requirements. The position also advances AI-assisted development practices, using tools such as Cursor and Microsoft Copilot to accelerate delivery and improve code quality. Work occurs in an Agile environment with a focus on clear execution, migration from Azure Data Factory, and alignment to established standards.

Due to client requirements, applicants must be willing and able to work on a W2 basis. For our W2 consultants, we offer a great benefits package that includes Medical, Dental, and Vision benefits, a 401k with company matching, and life insurance.

Rate: $70.00 to $90.00/hr. (W2)

Responsibilities
- Build and refactor data pipelines to support enterprise data collection, transformation, and delivery.
- Implement and support infrastructure that enables data storage, processing, and retrieval within Snowflake.
- Execute migration tasks to convert Azure Data Factory pipelines into Snowflake Snowpark solutions.
- Join and transform data from multiple source systems to produce datasets for reporting, dashboards, KPIs, and analytics consumption.
- Collaborate with management, architects, and team leads to deliver against clearly defined requirements and implementation instructions.
- Focus on execution and delivery, identifying issues or risks and escalating them to leadership as needed.
- Manage assigned tasks and deliverables aligned to project timelines and leadership priorities.
- Use advanced tools and coding techniques to implement Snowflake Snowpark pipelines based on provided designs and standards.
- Apply senior-level data engineering skills to deliver reliable, performant Snowflake solutions that meet requirements.
- Leverage AI-assisted development tools for code generation, refactoring, debugging, and documentation within established standards.
- Validate AI-assisted outputs to ensure security, performance, and data governance requirements are met.
- Share AI usage patterns, efficiencies, and lessons learned to support responsible adoption across the team.

Experience Requirements

- Strong hands-on experience with Snowflake, including Snowpark and SQL-based transformations.
- Advanced Python experience for data engineering workloads.
- Advanced SQL experience, including complex transformations and performance tuning.
- Proven experience migrating ETL solutions between cloud platforms, specifically moving pipelines from Azure to Snowflake.
- Working knowledge of Azure Data Factory, with the ability to understand and replicate existing pipelines.
- Experience working with REST APIs in Python.
- Understanding of data security requirements, with adherence to established policies and standards.
- Experience working in an Agile/Scrum environment.
- Experience using AI-assisted tools such as Cursor and Microsoft Copilot to support development and migrations.
- Solid coding background with a data engineering focus.
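To give candidates a feel for the kind of work described above, here is a purely illustrative sketch of joining data from multiple source tables into a reporting dataset. It uses Python's standard-library sqlite3 as a stand-in for Snowflake (no Snowflake account is needed to run it), and the table and column names are hypothetical, not from any client system:

```python
import sqlite3

# Illustrative only: sqlite3 stands in for Snowflake, and the schema is
# hypothetical. The shape of the task matches the role: join data from
# multiple source systems and aggregate it for reporting/KPI consumption.
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE orders (order_id INTEGER, customer_id INTEGER, amount REAL);
    CREATE TABLE customers (customer_id INTEGER, region TEXT);
    INSERT INTO orders VALUES (1, 10, 120.0), (2, 10, 80.0), (3, 20, 50.0);
    INSERT INTO customers VALUES (10, 'West'), (20, 'East');
""")

# Aggregate order revenue by region -- a typical reporting dataset.
rows = conn.execute("""
    SELECT c.region, SUM(o.amount) AS revenue
    FROM orders o
    JOIN customers c ON c.customer_id = o.customer_id
    GROUP BY c.region
    ORDER BY c.region
""").fetchall()

print(rows)  # [('East', 50.0), ('West', 200.0)]
```

In the actual role, the same join-and-aggregate pattern would be expressed against Snowflake tables, typically via Snowpark DataFrames or Snowflake SQL rather than sqlite3.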
Languages
- English