TRM International

Market Data Infrastructure Engineer – Lausanne

Lausanne, Vaud, Switzerland

About

We are hiring on behalf of a leading international institutional brokerage group that operates high-touch, hybrid, and fully electronic trading venues. Its customers include the largest investment banks, commercial banks, hedge funds, and asset managers, as well as the most active corporations in the energy derivatives markets. The group is also a market data vendor. To explore how new technologies can be embedded across its activities and to drive the group toward a data-driven digital organization, it has established a Data Science Lab with a strong entrepreneurial spirit, recently complemented by a Data Engineering Team that develops new data capabilities.

As part of the Data Platform Team, your mission will be to develop, deploy, and maintain data pipelines. These pipelines will be key to serving data consumption scenarios such as business intelligence, analytical tools, and other applications. To expand the team, they are looking for a data engineer who is proficient in Python and Docker and has a sharp problem-solving mindset.

Objectives and Responsibilities
  1. Develop, deploy, and maintain various data pipelines originating from diverse data sources, serving data scientists, various systems, and many end users (a minimal sketch of such a pipeline follows this list).
  2. Act as a technical point of reference for data scientists and business intelligence developers, and conduct code reviews.
  3. Work closely with business IT teams, data scientists, and external providers where applicable, to deploy, maintain, and monitor data processing and machine learning pipelines.
  4. Develop and maintain advanced CI/CD pipelines to ensure continuous integration, testing, and deployment of the pipelines.
  5. Develop and maintain documentation of the data architecture and data management processes.
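
As an illustration of the kind of pipeline work described above, here is a minimal sketch of an Airflow DAG with extract, transform, and load steps. The DAG name, schedule, and placeholder data are assumptions for illustration only and do not reflect the employer's actual systems; the sketch assumes Airflow 2.4 or later for the schedule argument.

from datetime import datetime

from airflow import DAG
from airflow.operators.python import PythonOperator


def extract(**context):
    # Placeholder extract step: in practice this would pull raw market data
    # from one of the diverse sources mentioned above.
    return [{"instrument": "EX1", "price": 100.0}]


def transform(**context):
    # Pull the upstream result from XCom and apply a trivial reshaping step.
    rows = context["ti"].xcom_pull(task_ids="extract")
    return [{**row, "price_rounded": round(row["price"], 2)} for row in rows]


def load(**context):
    # Placeholder load step: a real pipeline would write to a warehouse or data lake.
    rows = context["ti"].xcom_pull(task_ids="transform")
    print(f"Loaded {len(rows)} rows")


with DAG(
    dag_id="market_data_daily",      # hypothetical DAG name
    start_date=datetime(2024, 1, 1),
    schedule="@daily",               # assumes Airflow 2.4+ for this argument name
    catchup=False,
):
    extract_task = PythonOperator(task_id="extract", python_callable=extract)
    transform_task = PythonOperator(task_id="transform", python_callable=transform)
    load_task = PythonOperator(task_id="load", python_callable=load)

    extract_task >> transform_task >> load_task
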
Required Skills and Qualifications
  1. Master’s degree in computer science, information technology, engineering, or an equivalent field.
  2. Strong interest in data infrastructure, data management, and related topics.
  3. Experience in developing and deploying Python or Spark pipelines running on Docker (see the sketch after this list).
  4. Experience with data storage solutions such as data lakes, data warehouses, and relational databases.
  5. Experience building CI/CD pipelines with Jenkins, GitLab, or similar tools is a plus.
  6. Familiarity with orchestration tools such as Airflow.
  7. Knowledge of Linux, Python, and SQL.
  8. Ability to communicate and collaborate effectively within the team and with external stakeholders.
  9. Excellent spoken and written English.
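
To give a concrete sense of point 3, the sketch below shows a minimal PySpark batch job of the kind that could run inside a Docker container: it reads CSV files from an assumed landing zone, computes a daily average price per instrument, and writes Parquet output to an assumed data-lake path. All paths and column names are hypothetical and not the employer's actual data.

from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("market_data_batch").getOrCreate()

# Read raw trade files from an assumed landing zone (path is hypothetical).
raw = spark.read.option("header", True).csv("/data/landing/trades/*.csv")

# Compute the daily average price per instrument (column names are assumptions).
daily_avg = (
    raw.withColumn("price", F.col("price").cast("double"))
       .groupBy("trade_date", "instrument")
       .agg(F.avg("price").alias("avg_price"))
)

# Write the curated result to an assumed data-lake location in Parquet format.
daily_avg.write.mode("overwrite").parquet("/data/curated/daily_avg_price")

spark.stop()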

Job Types: Full-time, Permanent

Pay: CHF 112’000.00 per year

Work Location: In person

Ideal skills

  • Python
  • Docker
  • Spark
  • Jenkins
  • Gitlab
  • Linux
  • SQL

Professional experience

  • Data Engineer
  • Data Infrastructure

Language skills

  • English