About
Responsibilities
- Design, implement, and maintain robust data pipelines and ETL processes in Snowflake.
- Collaborate with federated teams and other data engineers to understand data requirements and deliver high-quality data solutions.
- Ensure data integrity and security across all data workflows and storage solutions.
- Monitor and troubleshoot data pipelines, addressing any issues promptly to ensure reliability and accuracy.
- Develop reusable and modular stored procedures and scripts for data processing.
- Implement best practices for data governance, data quality, and metadata management.

Minimum Qualifications
- Bachelor's or master's degree in Computer Science, Engineering, or a related field.
- Minimum of 5 years of experience in data engineering or a related role.
- Proven experience with Snowflake is required.
- Knowledge of data warehousing concepts, dimensional modeling, and performance tuning.
- Hands-on experience with ETL tools (e.g., Informatica, Talend, dbt, or custom ETL frameworks).
- Strong proficiency in SQL and database management.
- Experience with cloud platforms such as AWS, Azure, or Google Cloud.
- Familiarity with version control (Git) and CI/CD for data pipelines.
- Familiarity with Big Data technologies such as Hadoop, Spark, and Kafka is a plus.
- Excellent problem-solving and analytical skills.
- Strong communication and collaboration abilities.
Language skills
- English
Notice to users
This offer comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.