About
Responsibilities
- Build and maintain ETL pipelines using tools such as Informatica, Azure Data Factory, or Snowflake
- Write and optimize SQL queries for data extraction, transformation, and performance tuning
- Monitor daily data loads, troubleshoot failures, and resolve data issues quickly
- Work with business teams to understand reporting and data needs and translate them into technical solutions
- Clean and validate data, including handling duplicates, missing values, and inconsistencies
- Support and improve existing data processes, identifying opportunities for automation and efficiency
- Collaborate with other engineers and analysts on data models, workflows, and project delivery
- Maintain and update documentation for data pipelines, architecture, and processes
- Participate in testing and deployment of new data solutions and enhancements

Requirements
- Strong experience with SQL and data engineering concepts, including data warehousing and dimensional modeling
- Hands-on experience with at least one ETL tool such as Informatica, Azure Data Factory, or Snowflake
- Ability to manage multiple priorities and work in a fast-paced, collaborative environment
Languages
- English
Notice for Users
This job comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.