This job offer is no longer available
About
Required Skills: Big Data Engineer with a MapReduce, Spark, Hive, and SQL skillset. Expert in SQL and data warehousing concepts. Hands-on experience with a public cloud data warehouse (GCP, Azure, or AWS). GCP certification is a strong plus. MapR experience is a must. Strong hands-on experience with one or more programming languages (Python or Java). Hands-on expertise...
Nice-to-have skills
- Data Warehousing
- Hive
- Python
- SQL
- Spark
Work experience
- Data Engineer
- Data Infrastructure
Languages
- English
Notice for Users
This job was posted by one of our partners.