Data & Analytics Engineer
REKOM Group
- Milton Keynes, England, United Kingdom
About
platform and pipeline architecture to data modelling. You will act as the technical counterpart to the Business Systems team, ensuring that they have clean, governed and structured data to drive reporting and visualisation throughout the business. This is your chance to gain international exposure and lead a pivotal Group-wide data transformation initiative.

Key Objectives & Responsibilities

Strategy & Platform Architecture

- Platform Ownership & Infrastructure: Own the end-to-end design and implementation of our unified cloud data platform. Utilise Infrastructure as Code to provision and manage resources, ensuring a scalable, secure and reproducible environment.
- Data Strategy & Roadmap: Help redefine the roadmap for data maturity, transitioning the business from various manual workflows to proactive, automated intelligence for tactical reporting and data-driven decision-making. Architect a modern multi-layered data structure to serve as the blueprint for the data transformation.
- Cost & Resource Management: Along with the Head of Network and Infrastructure Services, proactively manage cloud compute and storage resources. Implement distinct warehouses / clusters, auto-scaling policies and budget alerts to maximise performance efficiency while preventing unchecked spend.
- Governance & Compliance: Establish a unified governance framework. Implement a centralised data catalogue, strict role-based access control, row / column-level security and data retention policies to ensure full compliance with data regulations and internal standards.

Data Integration, Engineering & DevOps

- Silo Elimination & Data Ingestion: Identify isolated data pockets across the business and unify them into the central platform. Design robust ELT / ETL pipelines to ingest high-volume data from internal systems, APIs and flat files.
- Modernisation & Migration: Lead the technical migration of the core reporting and data assets from our existing Data Warehouses to the new modern stack, ensuring data integrity, history preservation and zero downtime for business users.
- Process Automation: Eliminate manual interventions in reporting and transformation. Implement continuous integration / delivery pipelines to automate testing, version control and code deployment.
- Pipeline Orchestration: Architect reliable data workflows (using platform-native schedulers) to manage complex dependencies and ensure timely data delivery.

Data Modelling & Quality

- Semantic Layer Design: Partner with the Business Systems team to design optimised data models, ensuring seamless integration with downstream BI tools.
- Business Logic Standardisation: Implement business rules within the transformation layer. Ensure consistent definitions of critical KPIs across all departments to create a Single Source of Truth.
- Data Quality & Optimisation: Implement automated data quality checks to prevent bad data from entering the serving layer. Continuously optimise table structures for query performance.

Business Expansion & Scalability

- Cross-Functional Support: Expand the platform's capabilities beyond core Finance / Operations to support specialised needs in our Digital, Commercial and Trade Marketing departments, ensuring the architecture supports diverse data types and use cases.
- Group Integration Strategy: Lead the data integration strategy for Group-related companies. Architect the platform to securely handle multi-tenant or multi-entity data requirements, ensuring separation of concerns while enabling consolidated Group reporting.

Technical Skills & Qualifications

To succeed in this role, you will need a strong blend of architectural vision and hands-on engineering capability. We are looking for a practitioner who is comfortable building from the ground up using modern cloud-native technologies.

- Core Languages: Strong proficiency in SQL for complex querying and data modelling, and in Python for scripting, API integration and data automation.
- Cloud Data Platforms: Proven experience architecting and managing modern cloud data warehouses (e.g. Snowflake, Databricks or Azure Synapse). You must understand compute / storage separation and cost optimisation.
- Data Engineering & ETL/ELT: Experience building robust data pipelines using modern tools (e.g. Fivetran, CData, Airbyte or Azure Data Factory).
- Data Transformation: Experience building production data pipelines using SQL-based transformation frameworks (e.g. dbt or custom Python/SQL frameworks).
- Data Modelling: Knowledge of dimensional modelling concepts and experience designing semantic layers for BI tools.
- Infrastructure as Code (IaC): Hands-on experience provisioning and managing cloud infrastructure using IaC tools such as Terraform.
- DevOps & CI/CD: Proficiency with version control and setting up CI/CD pipelines to automate testing and deployment.
Language Skills
- English
Note for Users
This job listing comes from a partner platform of TieTalent. Click "Apply Now" to submit your application directly on their website.