About
We are seeking experienced data engineers to design and build data pipelines within the Microsoft Fabric ecosystem. The engineers will enhance data availability, readiness, and reliability, empowering other units to use the data effectively. By fostering data literacy, harmonization, and democratization, these roles will support consistent, standardized decision-making across the organization.
Key Responsibilities:
• Design, build, and maintain scalable and reliable ELT pipelines on Microsoft
Azure Cloud
• Collaborate with architects, analysts, and client stakeholders to define
technical requirements
• Lead or contribute to end-to-end implementation of data solutions in
enterprise environments
• Assist in creating and maintaining technical documentation including
Architecture Diagrams, Solution Documents, and other related project
documentation
• Design, develop, and optimize complex Python scripts and SQL queries
• Maintain orchestration workflows and ensure real-time and batch data
processes meet performance expectations
• Design and implement CI/CD pipelines
Requirements
Experience:
- 7 to 10 years of strong technical experience in building and maintaining cloud data warehouses
- 7 years of experience in designing and building scalable and efficient data pipelines with Microsoft Fabric or Azure technologies (Data Factory, Synapse Notebooks, Python, etc.)
- Expert in Python and SQL
- Hands-on experience with SSIS package design and development
- Solid background in data modeling (dimensional/star/snowflake schemas) and experience with BI tools like Power BI
- Ability to work cross-functionally and translate complex business requirements into scalable data solutions
- Experience with CI/CD for data pipelines and infrastructure-as-code tools (e.g., Terraform)
- Microsoft Azure Data Engineer Associate (DP-203) or equivalent
Languages
- English