About
Role Overview:
- Hands-on experience with Azure services: Databricks, Data Factory and DevOps
- Proficient in Python, Apache Spark, and PySpark
- Write advanced, complex SQL queries, with solid experience in database management
- Develop and maintain data models, data warehouses, and data marts in Snowflake
- Automate workflows, build reusable transformations, and tune performance
- Streamline complex ETL logic and transformations
- Collaborate with data scientists, analysts, and stakeholders to gather requirements and define data models that effectively support business needs
- Demonstrate decision-making, analytical, and problem-solving abilities
- Strong verbal and written communication skills to manage client discussions
- Familiar with Agile methodologies - daily scrum, sprint planning, backlog refinement
Key Responsibilities & Skillsets:
- Azure Databricks: ability to create data transformation logic
- Strong programming skills in Python and experience with SQL; able to write complex SQL, Transact-SQL, and stored procedures
- ETL tools: Azure Data Factory, Databricks
- Experience with data modelling and DWH - Snowflake
- Excellent communication skills and stakeholder management
- Ability to work independently in an IC role
Good to have:
- Knowledge of the Snowflake platform
- Knowledge of Power BI
- Familiarity with CI/CD practices and version control (e.g., Git)
- Familiarity with Azure DevOps
Ideal skills
- Azure Data Factory
- Data Modeling
- PySpark
- Python
- SQL
Professional experience
- Data Engineer
- Data Infrastructure
Language skills
- English
Notice to users
This offer comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.