About
Job Responsibilities
- 10+ years of experience in the analysis, coding, implementation, and testing of Big Data applications (API and batch) on AWS Cloud using Java/Scala/Kotlin, Cassandra, Python, Hadoop, HDFS, Spark, and the MapReduce framework
- Strong Scala, PySpark, and Java coding experience for REST APIs and SOAP web services
- Strong AWS experience (batch and API microservices) using EMR, EC2, ECS, S3, Airflow, Step Functions, and AWS API Gateway
- Strong batch application performance tuning (Spark/Hadoop/EMR, Java, Python)
- Must have Linux experience (shell scripting/Python)
Required Qualifications
- Experience in the analysis, coding, implementation, and testing of Big Data applications (API and batch) on AWS Cloud using Java/Scala/Kotlin, Cassandra, Python, Hadoop, HDFS, Spark, and the MapReduce framework
- Strong Scala, PySpark, and Java coding experience for REST APIs and SOAP web services
- Strong AWS experience (batch and API microservices) using EMR, EC2, ECS, S3, Airflow, Step Functions, and AWS API Gateway
- Strong batch application performance tuning (Spark/Hadoop/EMR, Java, Python)
- Must have Linux experience (shell scripting/Python)
Preferred Qualifications
- DevOps experience (Git, GitHub, Bitbucket, GitLab, IntelliJ IDEA, PyCharm)
- Experience creating CI/CD pipelines: Maven, Gradle, Jenkins, Artifactory, AWS CodeCommit, CloudFormation
Languages
- English