About
We are looking for strong AWS Data Engineers who are passionate about cloud technology. Your responsibilities will include:
- Design and Develop Data Pipelines: Create robust pipelines to ingest, process, and transform data, ensuring it is ready for analytics and reporting.
- Implement ETL/ELT Processes: Develop Extract, Transform, Load (ETL) or Extract, Load, Transform (ELT) workflows to move data seamlessly from source systems to data warehouses, data lakes, and lakehouses using open-source and AWS tools.
- Work with DBT and Snowflake: Use DBT (Data Build Tool) to perform transformations on Snowflake data; extract, transform, and load data into Snowflake; write complex queries that pull data from multiple source tables; and create testing scripts to verify data quality and accuracy in the target tables.
- Adopt DevOps Practices: Apply DevOps methodologies and tools for continuous integration and deployment (CI/CD), infrastructure as code (IaC), and automation to streamline and enhance our data engineering processes.
- Design Data Solutions: Leverage your analytical skills to design innovative data solutions that address complex business requirements and drive decision-making.
Nice-to-have skills
- AWS
- DevOps
- ETL
- SQL
Work experience
- Data Engineer
- Data Infrastructure
- DevOps
Languages
- English
Notice for Users
This job comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.