Python Data Engineer

Data Science Festival
  • London, England, United Kingdom

About

Python Data Engineer
Contract – £500–£600 per day
Location: London – 2 days per week

We are currently looking for a Data Engineer to join our fast-paced, data-driven tech team within a global digital media environment. You’ll play a crucial role in shaping how the business collects, models, and activates data across multiple commercial and editorial functions. In this hands‑on role, you will architect and build scalable data pipelines, optimise infrastructure, and deliver high‑value insight tools that empower decision makers. The Data Engineer will work closely with commercial, product, and analytics teams, making this position vital to the organisation’s long‑term data strategy.
The Opportunity
As a Data Engineer, you’ll combine engineering, analytics, and product thinking to create a reliable, high-performing data ecosystem. You’ll have the opportunity to work with modern cloud technologies, large-scale datasets, and a business that truly values data as a strategic asset.
Key Responsibilities
Building, operating, and optimising end‑to‑end ETL/ELT data pipelines using APIs, SFTP, and containerised orchestration tools.
Developing scalable and well‑structured data models that support commercial, programmatic, and affiliate revenue functions.
Managing and improving complex data infrastructure that processes high‑volume, multi‑source Big Data.
Creating, maintaining, and enhancing interactive dashboards that drive KPI‑focused decision‑making.
Owning data quality, ensuring accuracy, consistency, and reliability across all core datasets.
Analysing campaign, monetisation, and platform performance and providing actionable insights.
Collaborating with Operations, Sales, Marketing, Finance, and Senior Analytics teams.
Supporting strategic projects with advanced data modelling and insight generation.
This role stands out because it sits at the intersection of engineering and commercial impact: your work directly supports revenue optimisation, global reporting, and critical business decisions.
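To give a flavour of the pipeline work described above, the snippet below is a minimal, illustrative sketch only and is not taken from the role itself: it assumes the GCP/BigQuery stack mentioned in the skills section, and the API endpoint, table name, and field names are hypothetical placeholders.

```python
"""Illustrative ETL sketch: pull JSON from a (hypothetical) reporting API and load it into BigQuery."""
import requests
from google.cloud import bigquery

API_URL = "https://api.example.com/v1/campaign-performance"  # hypothetical endpoint
TABLE_ID = "my-project.analytics.campaign_performance"       # hypothetical table

def extract(date: str) -> list[dict]:
    """Pull one day of campaign rows from the source API."""
    resp = requests.get(API_URL, params={"date": date}, timeout=30)
    resp.raise_for_status()
    return resp.json()["rows"]

def transform(rows: list[dict]) -> list[dict]:
    """Keep only the fields the warehouse model needs and normalise types."""
    return [
        {
            "campaign_id": str(r["id"]),
            "impressions": int(r.get("impressions", 0)),
            "revenue_gbp": float(r.get("revenue", 0.0)),
            "date": r["date"],
        }
        for r in rows
    ]

def load(rows: list[dict]) -> None:
    """Append the transformed rows to a BigQuery table."""
    client = bigquery.Client()
    errors = client.insert_rows_json(TABLE_ID, rows)
    if errors:
        raise RuntimeError(f"BigQuery insert failed: {errors}")

if __name__ == "__main__":
    load(transform(extract("2024-01-01")))
```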
Skills and Experience
Strong Python and/or PySpark
Experience with cloud technologies such as GCP (BigQuery, Compute Engine, Kubernetes) and AWS (Redshift, EC2).
Experience building ETL/ELT pipelines and working with APIs or SFTP integrations.
Understanding of data modelling, warehousing, and Big Data environments.
Strong analytical and creative problem‑solving skills.
Ability to manage projects and collaborate effectively in a team.
Experience creating utility packages in Python (a minimal sketch follows below).
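For context on the utility-package point above, the snippet below is a hypothetical example of the sort of small, reusable helper such a package might contain; the module name and function are illustrative assumptions, not taken from the posting.

```python
# util/http.py - hypothetical module in a shared utility package (illustrative only)
import time
import requests

def get_json(url: str, retries: int = 3, backoff: float = 2.0, **kwargs) -> dict:
    """GET a JSON payload with simple exponential-backoff retries."""
    for attempt in range(retries):
        try:
            resp = requests.get(url, timeout=30, **kwargs)
            resp.raise_for_status()
            return resp.json()
        except requests.RequestException:
            if attempt == retries - 1:
                raise
            time.sleep(backoff ** attempt)
```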
If you would like to be considered for the role and feel you would be an ideal fit with our team, please send your CV to us by clicking on the Apply button below.

Languages

  • English