
Data Engineer - Remote / Telecommute

Cynet Systems
  • US
    United States

About

Job Description:
The Data Engineer is responsible for building, optimizing, and maintaining data pipelines, ensuring high data quality, strong observability, and reliable platform performance. This role supports modern data infrastructure, migration efforts, automation workflows, and cross-team collaboration to keep data operations scalable and efficient.

Requirement/Must Have
  • Advanced SQL skills for table management, deprecation, and data querying.
  • Python scripting for automation, ETL workflows, and alert tooling.
  • Experience with Airflow, including DAG creation, dependency management, and alert configuration.
  • Strong version control and CI/CD experience, including Git and deployment pipelines.
  • Ability to configure alerts and reduce false positives using observability tools such as Monte Carlo.
  • Experience integrating observability tooling with orchestration and monitoring platforms.
  • Ability to perform root cause analysis on alert triggers and pipeline issues.
  • Experience supporting legacy-to-modern data platform migrations.
  • Ability to debug upstream dependencies and external data source failures.
  • Experience documenting and handing off pipelines.
  • Ability to create and maintain technical documentation, including runbooks and alert resolution guides.
  • Experience collaborating with cross-functional teams.
  • Understanding of data governance and accountability models.
  • Experience with table deprecation, cleanup, and alert consolidation.
Experience
  • Relevant data engineering experience supporting large-scale data workflows, pipelines, and monitoring.
  • Experience working with Snowflake, Databricks, Salesforce, Airflow, or equivalent tools.
Responsibilities
  • Build and maintain automated data pipelines and workflows.
  • Manage data quality alerts, reduce noise, and resolve pipeline issues.
  • Support migration of legacy data systems to modern platforms.
  • Analyze and resolve upstream and dependency-related issues.
  • Maintain clean, well-documented, and efficient data infrastructure.
  • Collaborate with engineering and analytics teams to ensure alignment and data quality.
  • Create technical documentation, including wikis, runbooks, and process documentation.
  • Consolidate and optimize alerts and monitoring configurations.
Skills
  • Advanced SQL and Python.
  • Airflow DAG development and orchestration.
  • Observability and monitoring tools.
  • Troubleshooting and debugging skills.
  • Strong technical communication and documentation ability.
Qualification And Education
Relevant education or equivalent practical experience in data engineering, software engineering, or a related technical field.

Languages

  • English
Notice for Users

This job comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.