About
Skills, Experience, Qualifications: if you are the right match for this opportunity, make sure to apply today.
Key Responsibilities:
- Design, develop, and maintain cloud-based data pipelines and ETL/ELT workflows.
- Build and optimize data architectures to support structured and unstructured data processing.
- Collaborate with data analysts, data scientists, and business stakeholders to understand data needs.
- Implement data quality, security, and governance best practices.
- Monitor and troubleshoot data workflows to ensure high availability and performance.
- Optimize database and data storage solutions for performance and cost efficiency.
- Contribute to cloud adoption, migration, and modernization initiatives.
Mandatory Skills:
- Strong expertise with the Azure cloud platform.
- Strong experience with Databricks.
- Azure Data Factory proficiency required: building datasets, data flows, and pipelines in ADF (not just maintaining something already built).
- Hands-on experience with ETL/ELT tools and frameworks.
- Proficiency in SQL, Python, and data modeling.
- Knowledge of CI/CD pipelines and infrastructure-as-code tools.
- Understanding of data governance, security, and compliance.
Preferred Skills:
- Exposure to API integration and microservices architecture.
- Strong analytical and problem-solving skills.
- Azure cloud certifications and/or relevant past experience.
- AKS (Azure Kubernetes Service) experience, including ETL for applications containerized and deployed on AKS (or EKS).
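To illustrate the ETL/ELT pattern the role centers on, here is a minimal sketch in Python. All names (`extract_orders`, `transform`, `load`) and the in-memory SQLite target are hypothetical stand-ins for a real source system and warehouse; they are not part of this role's actual stack (ADF, Databricks).

```python
# Hypothetical ETL sketch: extract -> transform -> load.
# sqlite3 stands in for a real warehouse; sample rows stand in for a source system.
import sqlite3

def extract_orders():
    # Extract: in a real pipeline this would read from an API, file drop,
    # or data lake; here we return hard-coded sample rows.
    return [
        {"order_id": 1, "amount": "19.99", "country": "us"},
        {"order_id": 2, "amount": "5.00", "country": "DE"},
    ]

def transform(rows):
    # Transform: normalize types and casing so downstream queries are consistent.
    return [
        {
            "order_id": r["order_id"],
            "amount": float(r["amount"]),
            "country": r["country"].upper(),
        }
        for r in rows
    ]

def load(rows, conn):
    # Load: write the cleaned rows into the target table.
    conn.execute(
        "CREATE TABLE IF NOT EXISTS orders (order_id INT, amount REAL, country TEXT)"
    )
    conn.executemany(
        "INSERT INTO orders VALUES (:order_id, :amount, :country)", rows
    )
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract_orders()), conn)
```

In an orchestrator such as Azure Data Factory, each of these stages would typically map to a pipeline activity, with the orchestrator handling scheduling, retries, and monitoring rather than inline Python.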
Languages
- English
Notice for Users
This job comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.