About
Apple's Artificial Intelligence and Data Platforms (AiDP) team is seeking an experienced Software Engineer to build high-quality, scalable, and resilient distributed systems that power Apple's cloud analytics platforms and data pipelines. Apple's Enterprise Data Warehouse landscape caters to a wide variety of real-time, near-real-time, and batch analytical solutions. These solutions are an integral part of business functions such as Sales, Operations, Finance, AppleCare, Marketing, and Internet Services, enabling business drivers to make critical decisions. We use proprietary and open-source technologies such as Kafka, Spark, Iceberg, Airflow, and Presto. If you are looking to take on infrastructure problems at scale, both on-premises and in the cloud, with a focus on ease of use, ease of maintenance, and, most importantly, solutions that scale, you will enjoy working in AiDP.
Description
We engineer high-quality, scalable, and resilient distributed systems in the cloud that power data exploration, analytics, reporting, and production models. Our core systems are diverse and sit at an unusual intersection of high data volumes and systems distributed across cloud and on-premises infrastructure.
This role will build solutions that integrate open-source software with Apple's internal ecosystem. You will drive the development of new components and features from concept to release: design, build, test, and ship at a regular cadence. You will work closely with internal customers to understand their requirements and workflows, and propose new features and ecosystem changes to streamline their experience with the solutions on our platform.
This is a challenging software engineering role in which a large part of an engineer's time is spent writing code and designing and developing applications in the cloud, with the remainder spent tuning and debugging the codebase, supporting production applications, and supporting our application end users. The role requires in-depth knowledge of innovative technologies and cloud data platforms, along with the ability to independently learn new technologies and contribute to the success of various initiatives.
Minimum Qualifications
Knowledge of BI concepts and implementation experience in the cloud with databases such as Snowflake or BigQuery
Programming experience building high-quality software with at least one of the following languages: Python, Scala, or Java
Experience developing highly optimized SQL, procedures, and semantic processes for distributed data applications
Bachelor's degree in Computer Science or equivalent experience
Preferred Qualifications
3 or more years of experience building enterprise-level data applications on distributed systems
Hands-on experience designing and developing cloud-based applications, including compute services, database services, RESTful APIs, ETL, queues, and notification services
Experience with cloud data warehousing platforms such as Snowflake is highly valued
Experience developing Big Data applications using Java, Spark, or Kafka is a huge plus
Understanding of fundamentals of object-oriented design, data structures, algorithm design, and problem solving
Cloud technology experience on platforms like AWS, Microsoft Azure, Google Cloud
Data Visualization Tools: experience with software such as Streamlit, Superset, Tableau, Business Objects, and Looker
Data Insights and KPIs: working experience generating and visualizing data insights, metrics, and KPIs; use of basic ML models for anomaly detection, forecasting, and GenAI
Language Skills
- English