
Senior Data Engineer

Redlen
  • Canada
Apply Now

About

Redlen's Data Science team is responsible for unlocking insights from the gigabit-per-second image data generated by our photon-counting CT (PCCT) detectors. This team plays a pivotal role in understanding, characterizing, and reporting on detector performance and failure modes across reliability experiments, production testing, and in-gantry operations. By leveraging advanced data engineering, machine learning, and software development principles, the team transforms raw detector data into actionable intelligence that drives design optimization, reliability improvements, and image quality enhancements. Through close collaboration with cross-functional partners, the Data Science team ensures that Redlen's detectors deliver world-class imaging outcomes and meet the rigorous demands of clinical and industrial applications.
About the Role

The Senior Data Engineer will design, build, and maintain scalable data pipelines and automation frameworks that enable the Data Science team to process and analyze detector data efficiently. This role is critical to modernizing workflows and supporting Redlen's ambitious production scale-up, which targets a 20x increase in output. Working within a virtual Linux environment and leveraging containerized orchestration tools such as Apache Airflow, the Senior Data Engineer will automate scientific processes, streamline data ingestion from network drives, and ensure robust, reproducible workflows. By enabling high-throughput data processing and integration, this role empowers the team to deliver timely insights that improve detector reliability and optimize image quality.

Key Responsibilities

Data Pipeline Development & Automation
Design and implement ETL pipelines to process PCCT detector data and metadata from network drives.
Develop reusable Python-based automation frameworks for data ingestion, transformation, and integration.
Automate JMP workflows using JSL (JMP Scripting Language) and integrate them with Python.
Automate MATLAB scripts and incorporate them into centralized pipelines.
Refine and scale existing automation tools to handle a 20x increase in production volume.
Workflow Orchestration & Infrastructure
Build and maintain orchestration workflows using Apache Airflow on a virtual Linux server in a containerized environment.
Maintain version control and CI/CD practices for all developed tools and pipelines.
Data Quality, Storage & Performance
Optimize data storage and processing using efficient file formats such as Parquet.
Ensure data quality and integrity through validation, cleaning, and enrichment steps in ETL pipelines.
Implement performance tuning for large-scale sensor data processing in a non-cloud environment.
Manage metadata effectively, including designing structures for metadata storage and retrieval.
Governance & Compliance
Establish and enforce data governance practices, ensuring compliance with internal standards for data security, access control, and auditability.
Collaboration & Knowledge Sharing
Collaborate with detector scientists, product engineers, test technologists, and other stakeholders to understand manual workflows and convert them into automated, scheduled processes.
Document workflows and provide training/support for team members using the new tools.
Other
Other duties as assigned and relevant to the role.
Qualifications

Required Skills
Strong proficiency in Python for data engineering and automation.
Expertise in Apache Airflow for workflow orchestration.
Containerization experience (Docker), a prerequisite for Airflow deployment.
Fluency in JSL (JMP Scripting Language) and Python integration within JMP.
Experience with MATLAB scripting for scientific workflows.
Solid understanding of ETL design, data validation, and data quality frameworks.
Knowledge of efficient file formats (e.g., Parquet) for performance optimization.
Familiarity with Linux environments, version control (Git), and CI/CD pipelines.
Strong problem-solving skills.
Excellent oral and written communication skills in English.
Outstanding interpersonal skills.
Required Experience
6+ years of experience in data engineering or a related role, with a focus on automation and workflow orchestration.
Demonstrated ability to work in a dynamic environment as a member of a cross-functional team.
Proven track record of working with limited supervision and managing responsibilities proactively.
Experience handling challenging development projects with competing priorities and emerging/ambiguous requirements.
Demonstrated positive attitude and self-motivation.
Interest in mentoring junior team members.
Nice-to-Have
Experience with SharePoint API integration.
Exposure to scientific data workflows and reliability engineering.
Education
Bachelor's or Master's degree in Computer Science, Data Engineering, or related field.
Salary Range: $115,000 - $128,000 CAD
 
 

Language Skills

  • English
Note for Users

This job listing comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.