This job posting is no longer available
About
Locations: Nenagh / Dublin (Ireland) or Warsaw (Poland)
Work Mode: Hybrid (3 days per week in office)
Contract Role: 6 months
Start Date: Immediate joiners only, or candidates with a maximum of 2-3 weeks' notice
Role Overview
We are seeking an experienced Data Platform Solution Architect to design, build, and optimize modern, cloud-native data platforms. You will play a key role in shaping scalable, secure, and high-performance data architectures, leveraging best-in-class technologies across data engineering, cloud infrastructure, and DevOps.
This role requires a strong blend of hands‑on technical expertise, architectural leadership, and strategic thinking to deliver robust data solutions aligned with business needs.
Key Responsibilities
Design and deliver end-to-end data platform architectures using modern cloud-native patterns
Produce high-quality Architecture Design Documents (ADDs) and technical blueprints
Architect and optimize solutions using Snowflake, dbt, and Data Lakehouse (Iceberg)
Build and manage scalable S3-based data lakes and processing pipelines using EMR / PySpark
Develop and orchestrate workflows using Airflow / MWAA
Lead data ingestion, transformation, and modeling strategies
Ensure platform reliability, scalability and performance tuning across systems
Implement observability, monitoring and APM solutions
Drive CI/CD pipelines, automation and Infrastructure as Code (Terraform)
Apply security best practices and ensure compliance across data platforms
Integrate data cataloging and governance tools
Collaborate with cross-functional teams including engineering, DevOps, and business stakeholders
Required Skills & Experience
Core Architecture & Data Platforms
Strong experience in Solution Architecture and enterprise data platform design
Deep expertise in Snowflake architecture and performance tuning
Experience with Data Lakehouse architectures (Apache Iceberg)
Hands‑on with AWS S3 data lakes and EMR / PySpark processing
Data Engineering & Orchestration
Proficiency in Apache Airflow / AWS MWAA
Strong experience with dbt for data transformation
Solid understanding of data ingestion pipelines and data modeling
Cloud & DevOps
Strong knowledge of Terraform / Infrastructure as Code (IaC)
Hands‑on experience with CI/CD pipelines and DevOps practices
Experience implementing automation and deployment strategies
Performance, Reliability & Observability
Expertise in performance tuning (Snowflake, Airflow, Iceberg)
Strong focus on platform reliability and scalability
Experience with monitoring, logging and APM tools
Security & Governance
Knowledge of cloud security best practices
Experience with data governance and catalog tools
Nice to Have
Experience with multi‑cloud environments
Exposure to real‑time/streaming data architectures
Certifications in cloud platforms (AWS, etc.)
Experience working in regulated environments
What We’re Looking For
Strong problem‑solving and analytical mindset
Ability to translate business requirements into technical solutions
Excellent communication and stakeholder management skills
Passion for building modern, scalable data platforms
Language Requirements
- English
Note for Users
This job posting was published by one of our partners. You can view the original posting here.