This job posting is no longer available
About
OneMarketData is continuously searching for bright talent with the skills to make an impact. From developers to data scientists, at OneTick you will have the opportunity to develop and sharpen your problem-solving skills using a combination of analytics, imagination, and talent.

We're a leading RegTech company, delivering advanced market and trade surveillance solutions to some of the world's top financial institutions, including regulators, exchanges, and brokers. Our flagship product, OneTick Trade Surveillance, was named Best Trade Surveillance Solution at the TradingTech Insight Awards USA 2025 and is trusted to detect complex forms of market manipulation. Every day, our systems process and analyze massive volumes of market data, helping firms stay compliant with global regulatory requirements such as MiFID II and the rules of the SEC, FINRA, ASIC, and IIROC.

We're looking for a Data Engineer to help us build and maintain reliable, scalable data workflows that power our trade surveillance platform.

You'll thrive in this role if you:
- Are a curious and adaptable engineer who enjoys working with complex data flows, transforming large datasets, and adapting systems to real-world business needs
- Want to deepen your skills and develop further as a Data Engineer or Python developer
- Have a genuine interest in the investing and trading domain and are excited about gaining international experience and career opportunities in a high-impact field

Tech stack:
- AWS (EFS, EC2, SGW, ASG, NLB, and more)
- Terraform and Ansible for infrastructure as code
- Kubernetes (K8s) and Docker
- Grafana and Sentry for observability (metrics, logs, alerts)
- OneTick time-series database
- Apache Airflow for data-processing pipelines (see the DAG sketch after this section)
- Python 3 as the primary programming and scripting language

What you'll do:
- Build, maintain, and customize ETL pipelines that process large volumes of data daily
- Investigate and resolve data-processing issues
- Collaborate directly with clients to gather requirements, provide updates, and ensure successful delivery of data workflows

Requirements:
- Linux: confident with the command line and system fundamentals
- GitLab / Git: working knowledge of Git, the main Git commands, and core branching strategies (Git Flow, GitHub Flow, GitLab Flow)
- Python: hands-on experience with Python and its main data-manipulation tools (Pandas, NumPy, datetime handling, string transformations); see the pandas sketch after this section
- English: professional working proficiency (spoken and written)

Will be considered an advantage:
- Kubernetes and Docker: ability to explore a cluster and inspect pods/containers
- Experience with Apache Airflow, especially for managing data-processing pipelines
- Basic knowledge of financial markets and trading workflows: common asset classes (e.g., equities, derivatives, fixed income), trade lifecycle events, and corporate actions

What we offer:
- Flexible working arrangements: fully remote, hybrid, or office-based, depending on your location and preferences
- Competitive compensation commensurate with your experience, education, skillset, and local market standards
- Regular performance reviews linked to salary adjustments
- Medical insurance for you and your immediate family members
- A professional-development budget for courses, certifications, and conferences
- A supportive international team

The position requires a background check, a signed NDA, a signed contract, and a signed GDPR processor passthrough agreement, since we act as a data processor under GDPR.
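To make the Python requirement above concrete: a minimal sketch, assuming nothing beyond pandas and NumPy, of the kind of data manipulation the Requirements list mentions (datetime parsing, string transformations). The column names and sample rows are invented for illustration and are not from the posting.

    # Minimal sketch (illustrative only): datetime and string handling
    # with pandas/NumPy on an invented trade-like DataFrame.
    import numpy as np
    import pandas as pd

    trades = pd.DataFrame({
        "symbol": ["aapl ", "MSFT", " googl"],   # messy tickers (made up)
        "ts": ["2025-01-02 09:30:00",
               "2025-01-02 09:30:01",
               "2025-01-02 09:31:02"],
        "qty": [100, -50, 200],                  # signed quantities (made up)
    })

    # String transformations: normalize ticker symbols.
    trades["symbol"] = trades["symbol"].str.strip().str.upper()

    # Datetime handling: parse timestamps, then bucket trades by minute.
    trades["ts"] = pd.to_datetime(trades["ts"])
    trades["minute"] = trades["ts"].dt.floor("min")

    # NumPy: derive trade side from the sign of the quantity.
    trades["side"] = np.where(trades["qty"] >= 0, "BUY", "SELL")
    print(trades)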
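Likewise, a minimal sketch of the kind of Airflow-based daily pipeline named in the tech stack, assuming Airflow 2.4+; the DAG id, task ids, and placeholder callables are hypothetical, not OneTick's actual workflows.

    # Minimal sketch (illustrative only, assuming Airflow 2.4+):
    # a daily two-task extract -> transform DAG. All names are hypothetical.
    from datetime import datetime

    from airflow import DAG
    from airflow.operators.python import PythonOperator

    def extract():
        print("pull raw market data")     # placeholder body

    def transform():
        print("clean, enrich, and load")  # placeholder body

    with DAG(
        dag_id="example_daily_etl",       # hypothetical name
        start_date=datetime(2025, 1, 1),
        schedule="@daily",
        catchup=False,
    ) as dag:
        t_extract = PythonOperator(task_id="extract", python_callable=extract)
        t_transform = PythonOperator(task_id="transform", python_callable=transform)
        t_extract >> t_transform          # extract runs before transform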
Languages
- English
Note for users
This job posting was published by one of our partners. You can view the original posting here.