Senior Snowflake Data Engineer
Confie
United States

This job posting is no longer available.



About

Pay Range: $140,000 - $160,000 / year
Our Perks
  • Generous PTO plans, sick pay, and health benefits
  • Annual bonus based on employment standing
  • Work-from-home and hybrid employment model
  • Confie Enablement Fund / Scholarship Program
  • I-Care Recognition Program
  • Corporate Social Responsibility Program
  • Diversity, Equity, and Inclusion Initiatives
  • Confie Hub and Discount Programs (Gym Membership)
Purpose

Work under the guidance and supervision of the Director, Enterprise Architecture to build Confie's next-generation Enterprise Data Solutions. Responsible for developing robust data models, creating efficient ELT processes, and optimizing performance to support the organization's data needs. Requires expertise in designing, implementing, and maintaining data solutions on Snowflake data cloud environments that drive critical business insights and operations.

Essential Duties & Responsibilities

  • Design and develop data pipelines and ETL workflows to populate the cloud Data Lake and Data Warehouse on Snowflake with a transformation tool (e.g., Coalesce, WhereScape, Azure Data Factory) and replication tools (e.g., Fivetran, Airbyte).
  • Translate business requirements into technical specifications; establish and define details, definitions, and requirements of applications, components, and enhancements.
  • Design and develop robust and scalable data pipelines to support data integrations using Snowflake, Coalesce, Python, Airflow, and Fivetran.
  • Design and develop Snowflake data objects (tables, views, stored procedures, UDFs, etc.).
  • Implement ELT (Extract, Load, Transform) processes using Snowflake features such as Snowpipe, Streams, and Tasks.
  • Perform data cleaning, analysis, and integration using Python.
  • Work with multiple data sources and types (structured / semi-structured / unstructured).
  • Design and implement efficient data models and schemas within Snowflake to support reporting, analytics, and business intelligence needs.
  • Monitor and optimize query performance and resource utilization within Snowflake using query profiling, query optimization techniques, and workload management features.
  • Identify and resolve performance bottlenecks and optimize data warehouse configurations for maximum efficiency.
  • Implement processes and systems to monitor data quality, ensuring production data is always accurate and available for key stakeholders and the business processes that depend on it.
  • Monitor data pipelines for timely and accurate completion.
  • Stay up to date with industry trends and advancements in data engineering, continuously improving the team's technical knowledge and skills.
  • Provide on-call support.

Qualification and Education Requirements
  • 6 years of professional experience in data engineering, designing and implementing data pipelines and building data infrastructure.
  • 5 years of strong experience required in the Snowflake data cloud and ETL development, including Snowflake procedures, UDFs in Python and SQL, streams, tasks, Snowpipe, and working with semi-structured data.
  • 5 years of strong experience with Python programming, with extensive use of frameworks / packages such as Snowpark, pandas, NumPy, and requests for data analysis and integration.
  • Solid understanding of data warehousing concepts, dimensional modeling, and data integration techniques.
  • 5 years of strong experience with data integration and transformation tools such as Coalesce, WhereScape, and Azure Data Factory.
  • Experience with Databricks, Google BigQuery, and AWS Redshift is a plus.
  • Experience with data quality and observability concepts is a plus.
  • Experience with cloud platforms (e.g., AWS, Azure, GCP) and cloud-based data technologies is a plus.

Skills
  • A Love for All Things Data - The backbone of a good data engineer is understanding the life cycle of data, from source systems to final interpretation in a report.
  • A Passion to Learn - Strong desire and ability to learn new tools and skills and to acquire knowledge.
  • Listening Skills - The ability to understand what people say.
  • Analytical Skills - The ability to critically evaluate information from multiple sources and break down high-level information into details.
  • Observation Skills - The ability to validate data obtained via other techniques and expose new areas for elicitation.
  • Organizational Skills - The ability to work with the vast array of information gathered during analysis and to cope with rapidly changing information.
  • Interpersonal Skills - The ability to help set priorities.
  • Oral and Written Skills - Excellent written and verbal communication with little to no supervision.
  • Critical thinking and problem-solving skills.
  • Confidence in communicating and translating data-driven insights and technical concepts into simple terminology for business clients at various levels.
  • Ability to work and communicate effectively with any level of the user community.
Key Skills
Apache Hive, S3, Hadoop, Redshift, Spark, AWS, Apache Pig, NoSQL, Big Data, Data Warehouse, Kafka, Scala

Employment Type

Full-Time

Department / Functional Area

Administration

Vacancy

1

Language Skills

  • English
Notice to Users

This posting was published by one of our partners. You can view the original posting here.