About
We lead with integrity, and we emphasize work/life balance for all of our teammates.

How will you make an impact in this role? There are hundreds of opportunities to make your mark on technology and life at American Express. Here's just some of what you'll be doing:

- Develop modern, high-quality, and robust operational engineering capabilities.
- Develop software in a constantly evolving technology stack that currently includes big data platforms such as BigQuery, Airflow, and DataProc.
- Work with business partners and stakeholders to understand functional requirements, architecture dependencies, and business capability roadmaps.
- Create technical solution designs to meet business requirements.
- Define best practices to be followed by the team.
- Take your place as a core member of an Agile team driving the latest development practices.
- Identify and drive reengineering opportunities, as well as opportunities for adopting new technologies and methods.
- Perform peer code reviews and participate in technical discussions with the team on the best possible solutions.

Minimum Qualifications:

- BS or MS degree in computer science, computer engineering, or another technical discipline, or equivalent work experience.
- 5+ years of hands-on software development experience with Big Data & Analytics solutions:
Hadoop, Hive, Spark, Python, shell scripting, and GCP Cloud (BigQuery, Airflow, DataProc, Pub/Sub)
- Strong experience designing, developing, and optimizing data pipelines for large-scale data processing, transformation, and analysis using Big Data and GCP technologies.
- Proficiency in SQL and database systems, with experience designing and optimizing data models for performance and scalability.
- Experience building event-processing pipelines with Kafka or GCP Pub/Sub.
- Design and development experience with Airflow, Pub/Sub, Kafka, Git, and Jenkins is desirable.
- Knowledge of distributed (multi-tiered) systems, algorithms, and relational databases.
- Strong object-oriented programming skills and knowledge of design patterns.
- Experience with CI/CD pipelines, automated test frameworks, and source code management tools (XLR, Jenkins, Git).
- Good knowledge of and experience with configuration management tools like GitHub.
- Ability to analyze complex data engineering problems, propose effective solutions, and implement them.
- Ability and willingness to learn, adopt, and build solutions using the enterprise frameworks established for Big Data.
- Looks proactively beyond the obvious for continuous improvement opportunities.
- Communicates effectively with product and cross-functional teams.
- Willingness to learn new technologies and leverage them to their optimal potential.
- Understanding of various SDLC methodologies; familiarity with Agile and Scrum ceremonies.
- Certification on a cloud platform (GCP Professional Data Engineer) is a plus.
Language skills
- English
Notice to users
This job offer comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.