About
Key Responsibilities:
• Solid experience with database design practices and data warehousing concepts.
• Design, develop, and maintain data pipelines and ETL processes to extract, transform, and load data from various sources into our data warehouse.
• Hands-on experience delivering data reporting requirements on on-prem SQL Servers using SSIS and/or other integration processes.
• Expertise coding and implementing data pipelines in cloud-based data infrastructure for batch and streaming ETL using Azure Data Factory, Spark, Python, and Scala on Databricks.
• Experience working with Azure Data Lake Storage and Azure SQL Databases.
• Experience working with Databricks Delta Lake, Medallion Architecture, and Unity Catalog.
• Optimize data ingestion and transformation processes to ensure efficient and scalable data flows; monitor and optimize database performance, including query tuning and index optimization.
• Develop and maintain data models, schemas, and data dictionaries for efficient data storage and retrieval.
• Perform data analysis, data profiling, and data cleansing to ensure data quality and accuracy; carry out data validation, quality checks, and troubleshooting to identify and resolve data-related issues.
• Develop Power BI dashboards, reports, and visualizations to effectively communicate data insights; implement data transformations, data modeling, and data integration processes in Power BI.
• Create and maintain technical documentation related to database structures, schemas, and reporting solutions.
• Experience with MS Office tools such as Excel, Word, and PowerPoint.
• Experience with build and deploy tools such as Azure DevOps, GitHub, and Jenkins.
• Experience working on an Agile development team, delivering features incrementally using Jira and Confluence.
• Stay up to date with industry trends and best practices in data engineering and Power BI, and proactively recommend and implement improvements to our data infrastructure.
• Ability to multi-task and remain adaptable and nimble within a team environment.
• Strong communication, interpersonal, analytical, and problem-solving skills.
Preferred:
• Experience with API/REST and JSON.
• Experience in banking and financial domains.
• Industry certification (Azure/GCP/AWS).
• Familiarity with a variety of programming languages, such as Java, JavaScript, and C/C++.
• Familiarity with containerization and orchestration technologies (Docker, Kubernetes), and experience with shell scripting in Bash, Unix, or Windows shells.
• Experience with Power Automate and Power Apps.
• Experience with Collibra Data Lineage and Data Quality modules.
Education: Bachelor's degree in Computer Science or Information Systems, along with work experience in a related field.
Languages
- English