Senior Data Engineer
Saxon Global
Phoenix, Arizona, United States

This job posting is no longer available.


About

The Senior Data Engineer in Phoenix, AZ 85029 will play a pivotal role in delivering major data engineering initiatives within the Data & Advanced Analytics space. This position requires hands-on expertise in building, deploying, and maintaining robust data pipelines using Python, PySpark, and Airflow, as well as designing and implementing CI/CD processes for data engineering projects.

Key Responsibilities

  1. Data Engineering: Design, develop, and optimize scalable data pipelines using Python and PySpark for batch and streaming workloads.

  2. Workflow Orchestration: Build, schedule, and monitor complex workflows using Airflow, ensuring reliability and maintainability (see the sketch after this list).

  3. CI/CD Pipeline Development: Architect and implement CI/CD pipelines for data engineering projects using GitHub, Docker, and cloud-native solutions.

  4. Testing & Quality: Apply test-driven development (TDD) practices and automate unit/integration tests for data pipelines.

  5. Secure Development: Implement secure coding best practices and design patterns throughout the development lifecycle.

  6. Collaboration: Work closely with Data Architects, QA teams, and business stakeholders to translate requirements into technical solutions.

  7. Documentation: Create and maintain technical documentation, including process/data flow diagrams and system design artifacts.

  8. Mentorship: Lead and mentor junior engineers, providing guidance on coding, testing, and deployment best practices.

  9. Troubleshooting: Analyze and resolve technical issues across the data stack, including pipeline failures and performance bottlenecks.

  10. Cross-Team Knowledge Sharing: Cross-train team members outside the project team (e.g., operations support) for full knowledge coverage.

The role includes all of the above skills, plus the following:

  • 10+ years of overall IT experience
  • Experience with waterfall, iterative, and agile methodologies
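To make the first two responsibilities concrete, here is a minimal sketch of an Airflow DAG that schedules a daily PySpark batch step. It assumes Airflow 2.4+ and a Spark-enabled worker; the DAG id, file paths, and aggregation are hypothetical illustrations, not details from this posting.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def run_daily_batch(ds: str, **_) -> None:
    # Hypothetical PySpark batch step: read one day's partition,
    # aggregate events per customer, and write the result back out.
    from pyspark.sql import SparkSession, functions as F

    spark = SparkSession.builder.appName("daily_events_batch").getOrCreate()
    events = spark.read.parquet(f"/data/raw/events/ds={ds}")  # placeholder path
    counts = events.groupBy("customer_id").agg(F.count("*").alias("event_count"))
    counts.write.mode("overwrite").parquet(f"/data/curated/daily_counts/ds={ds}")
    spark.stop()


with DAG(
    dag_id="daily_events_pipeline",   # hypothetical name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",                # Airflow 2.4+ keyword
    catchup=False,
) as dag:
    PythonOperator(task_id="run_daily_batch", python_callable=run_daily_batch)

In a production deployment the Spark step would more often be submitted to a cluster (for example via the Spark provider's SparkSubmitOperator); the PythonOperator here simply keeps the sketch self-contained.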

Technical Requirements

  1. Hands-on Data Engineering: 5+ years of practical experience building production-grade data pipelines using Python and PySpark.

  2. Airflow Expertise: Proven track record of designing, deploying, and managing Airflow DAGs in enterprise environments.

  3. CI/CD for Data Projects: Ability to build and maintain CI/CD pipelines for data engineering workflows, including automated testing and deployment.

  4. Cloud & Containers: Experience with containerization (Docker) and cloud platforms (GCP) for data engineering workloads, plus an appreciation for twelve-factor design principles.

  5. Python Fluency: Ability to write object-oriented Python code, manage dependencies, and follow industry best practices.

  6. Version Control: Proficiency with Git for source code management and collaboration (commits, branching, merging, GitHub/GitLab workflows).

  7. Unix/Linux: Strong command-line skills in Unix-like environments.

  8. SQL: Solid understanding of SQL for data ingestion and analysis.

  9. Collaborative Development: Comfortable with code reviews, pair programming, and using remote collaboration tools effectively.

  10. Engineering Mindset: Writes code with an eye for maintainability and testability; excited to build production-grade software (see the test sketch below).

  11. Education: Bachelor's or graduate degree in Computer Science, Data Analytics or related field, or equivalent work experience.
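As a concrete example of the testability called out in requirement 10 and the TDD practice in responsibility 4, here is a minimal pytest sketch for a PySpark transform. The transform, column names, and fixture are hypothetical; it assumes pyspark and pytest are installed.

import pytest
from pyspark.sql import DataFrame, SparkSession, functions as F


def add_event_count(events: DataFrame) -> DataFrame:
    # Hypothetical unit under test: per-customer event counts.
    return events.groupBy("customer_id").agg(F.count("*").alias("event_count"))


@pytest.fixture(scope="session")
def spark():
    # Small local Spark session shared across the test session.
    session = SparkSession.builder.master("local[1]").appName("tests").getOrCreate()
    yield session
    session.stop()


def test_add_event_count(spark):
    events = spark.createDataFrame(
        [("a", "click"), ("a", "view"), ("b", "click")],
        ["customer_id", "event_type"],
    )
    result = {row["customer_id"]: row["event_count"]
              for row in add_event_count(events).collect()}
    assert result == {"a": 2, "b": 1}

Writing the test against a small in-memory DataFrame keeps pipeline logic verifiable in CI without access to production data.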


Language Skills

  • English

Note for Users

This job posting was published by one of our partners. You can view the original posting here.