
AWS Data Engineer- Remote

FreelanceJobs
  • CA
    Canada

About

Position: AWS Data Engineer
Location: Remote
Work Timings: 10 am to 7 pm IST
Client: Salesforce India
JD:
We are seeking a highly skilled and experienced Data Engineer with deep expertise in the AWS ecosystem to build, manage, and optimize our data pipelines. The ideal candidate will be a hands-on builder with a strong focus on data orchestration and transformation, capable of integrating various AWS services to create a robust data platform.
Key Responsibilities:
Orchestration: Design, develop, and maintain data workflows using Apache Airflow on MWAA (Amazon Managed Workflows for Apache Airflow) to automate and manage complex ETL processes.
Data Transformation: Write, test, and deploy AWS Glue jobs with a strong emphasis on Scala (and some Python) for large-scale data cleansing, transformation, and enrichment.
Data Storage & Management: Manage and optimize data stored in Amazon S3 for our data lake, and design and implement efficient data models within Amazon DynamoDB for high-performance data access.
Pipeline Integration: Seamlessly integrate data sources from external APIs and platforms, such as Salesforce, into our AWS data architecture.
Monitoring & Operations: Implement robust monitoring and alerting using AWS CloudWatch to ensure the health, performance, and reliability of our data pipelines.
Ecosystem Expertise: Demonstrate a comprehensive understanding of how all these components—MWAA, Glue, S3, DynamoDB, and CloudWatch—interface with one another to form a cohesive and scalable data solution.
Required Skills & Qualifications:
5 years of experience in building and managing production-level data pipelines on AWS.
Expertise in MWAA/Apache Airflow for workflow orchestration.
Strong proficiency in Scala for developing data transformation jobs, preferably with AWS Glue.
Experience with PySpark is a plus.
In-depth knowledge of Amazon DynamoDB and Amazon S3.
Familiarity with pulling data from various sources, including external APIs and enterprise systems like Salesforce.
Proficiency in AWS CloudWatch for monitoring and logging.
Excellent problem-solving skills and a strong ability to debug and troubleshoot complex data issues.
Regards,
Rachna
Yaza Group
Contract duration: less than 1 month, 30 hours per week.
Mandatory skills: ETL Pipeline, SQL, ETL, Data Migration, AWS Lambda, Amazon RDS, AWS CloudFormation, PySpark, Amazon Redshift, Database Design, Amazon Athena, AWS Glue, Big Data, Apache Hadoop, Amazon Web Services

Language skills

  • English
Notice to users

This offer comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.