This job posting is no longer available.
**Title:** Senior Snowflake Data Engineer
**Location:** Santa Clara, CA 95054 (Onsite, 5 days/week)
**Interview:** In-person required (local candidates only)
### Position Overview
We are seeking a highly skilled Senior Snowflake Data Engineer to join our data engineering team in Santa Clara, CA. The ideal candidate will possess deep technical expertise in Snowflake and Databricks, a strong foundation in modern data architecture, and a proven track record of building and optimizing large-scale data pipelines. This is a fully onsite role requiring a local candidate who is able to attend in-person interviews.
### Key Responsibilities
- **Data Engineering & Architecture:** Design, build, and optimize scalable data pipelines using Snowflake and Databricks. Apply modern data architecture principles for robust and reliable data solutions.
- **ETL/ELT Development:** Develop and maintain ETL/ELT workflows for data ingestion, transformation, validation, and loading using advanced SQL and scripting.
- **Data Modeling:** Implement and optimize dimensional models (star and snowflake schemas) and ensure efficient storage and retrieval.
- **Cost Optimization:** Optimize compute and storage usage across Snowflake and Databricks to balance performance and cost.
- **Administration:** Manage database design, schema evolution, user roles, permissions, and access control policies.
- **Data Quality & Monitoring:** Implement data lineage, quality, and monitoring frameworks to ensure reliability and accuracy.
- **Reporting & Visualization:** Collaborate with BI teams and develop solutions using tools such as Power BI and Sigma Computing.
- **Collaboration:** Work cross-functionally to define requirements, drive technical initiatives, and communicate effectively with both technical and non-technical stakeholders.
### Key Requirements
- Hands-on experience building and optimizing data pipelines using Snowflake and Databricks.
- Strong knowledge of data architecture, including ingestion, transformation, storage, and access control.
- Experience in system design and solution architecture with a focus on scalability and reliability.
- Expertise in ETL/ELT pipeline development and automation.
- Advanced SQL skills for data processing and transformation.
- Deep understanding of data modeling (dimensional, star, and snowflake schemas).
- Experience optimizing compute and storage costs in cloud environments.
- Administration skills: database design, schema management, user roles, permissions, and security.
- Experience implementing data lineage, data quality, and monitoring frameworks.
- Familiarity with reporting/visualization tools (Power BI, Sigma Computing).
- Excellent communication skills and the ability to work independently and collaboratively.
**Note:** This position is onsite in Santa Clara, CA (5 days/week). Only local candidates will be considered. In-person interviews are required.
**Interested?** If you meet the above qualifications and are passionate about building high-performance, large-scale data solutions, we encourage you to apply!
### Language Requirements
- English
### Note for Users
This job posting was published by one of our partners. You can view the original posting here.