About
Responsibilities:
- Design and implement robust ETL/ELT pipelines to ingest, transform, and load data from various sources.
- Work with relational and columnar databases, including MySQL, Oracle (PL/SQL), and Vertica, to manage large datasets.
- Develop and schedule data workflows using Shell/Bash scripting and Python.
- Build and maintain data pipelines on Snowflake using ELT best practices.
- Collaborate with data analysts, data scientists, and other stakeholders to understand data needs.
- Optimize existing processes for scalability and performance.
- Use tools such as Airflow, Fivetran, and DBT for orchestration, transformation, and data movement.
- Monitor data pipeline performance and resolve issues in a timely manner.
- Document processes, data flows, and technical architecture.
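The extract-transform-load pattern at the heart of these responsibilities can be sketched in miniature. The sample CSV, the `orders` table, and the use of SQLite as a stand-in warehouse below are illustrative assumptions, not part of the posting:

```python
import csv
import io
import sqlite3

# Hypothetical sample data standing in for an upstream source export.
RAW_CSV = """order_id,amount,currency
1,19.99,USD
2,5.00,usd
3,12.50,USD
"""

def extract(raw: str) -> list[dict]:
    """Extract: parse the raw source export into rows."""
    return list(csv.DictReader(io.StringIO(raw)))

def transform(rows: list[dict]) -> list[tuple]:
    """Transform: cast types and normalise the currency code."""
    return [(int(r["order_id"]), float(r["amount"]), r["currency"].upper())
            for r in rows]

def load(rows: list[tuple], conn: sqlite3.Connection) -> None:
    """Load: write the cleaned rows into the target table."""
    conn.execute("CREATE TABLE IF NOT EXISTS orders "
                 "(order_id INTEGER, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO orders VALUES (?, ?, ?)", rows)
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(RAW_CSV)), conn)
total = conn.execute("SELECT SUM(amount) FROM orders").fetchone()[0]
print(round(total, 2))
```

In production, each stage would typically be a separate task orchestrated by a tool such as Airflow, with Fivetran handling extraction and DBT handling in-warehouse transformation.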
Required Skills & Qualifications:
- Solid experience in ETL/ELT development.
- Strong experience with MySQL and PL/SQL.
- Hands-on experience with Vertica and Snowflake.
- Proficiency in Shell/Bash scripting and Python.
- Solid understanding of data modeling and SQL performance tuning.
- Familiarity with DBT (Data Build Tool) for transformation workflows.
- Basic knowledge of Fivetran and Apache Airflow.
- Experience working in an Agile environment.
- Excellent problem-solving and communication skills.
- Experience with Git version control.
- Exposure to CI/CD pipelines for data engineering.
- Knowledge of the AWS cloud platform.
- Familiarity with data governance and security best practices.
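SQL performance tuning, one of the skills listed above, often comes down to replacing a full table scan with an index lookup. A minimal sketch, using SQLite as a hypothetical stand-in (the `events` table and index name are invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE events (id INTEGER PRIMARY KEY, user_id INTEGER, ts TEXT)")
conn.executemany(
    "INSERT INTO events (user_id, ts) VALUES (?, ?)",
    [(i % 100, f"2024-01-{i % 28 + 1:02d}") for i in range(10_000)],
)

def plan(sql: str) -> str:
    """Return SQLite's query-plan description for a statement."""
    return " ".join(row[3] for row in conn.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT COUNT(*) FROM events WHERE user_id = 42"
before = plan(query)  # full scan of the table (reported as a SCAN step)
conn.execute("CREATE INDEX idx_events_user ON events(user_id)")
after = plan(query)   # index lookup (reported as a SEARCH ... USING INDEX step)
print(before)
print(after)
```

The same diagnostic habit carries over to MySQL, Vertica, and Snowflake via their respective `EXPLAIN` facilities, though the plan output formats differ.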
Language skills
- English
Note for users
This job listing comes from a TieTalent partner platform. Click "Jetzt Bewerben" to submit your application directly on their website.