
Data Engineer

remoteBase
  • United States
Apply Now

About

Data Pipeline Engineer
This role focuses on building and maintaining robust, scalable, and automated data pipelines, and plays a key role in optimizing our data infrastructure and enabling efficient data delivery across the organization. As the organization enhances its cloud data platform (Snowflake or a similar warehouse), this role will be instrumental in implementing and managing CI/CD processes, infrastructure as code (Terraform), and data transformation workflows (dbt).

Job Responsibilities
  • Design, build, and maintain scalable and resilient CI/CD pipelines for data applications and infrastructure, with a focus on Snowflake, dbt, and related data tools.
  • Implement and manage Snowflake dbt projects for data transformation, including developing dbt models, tests, and documentation, and integrating dbt into CI/CD workflows.
  • Develop and manage infrastructure as code (IaC) using Terraform to provision and configure cloud resources for data storage, processing, and analytics on GCP.
  • Automate the deployment, monitoring, and management of Snowflake data warehouse environments, ensuring optimal performance, security, and cost-effectiveness.
  • Collaborate with data engineers and data scientists to understand their requirements and provide robust, automated solutions for data ingestion, processing, and delivery.
  • Implement and manage monitoring, logging, and alerting systems for data pipelines and infrastructure to ensure high availability and proactive issue resolution.
  • Develop and maintain robust automation scripts and tools, primarily in Python, to streamline operational tasks, manage data pipelines, and improve efficiency; Bash scripting for system-level tasks is also required (see the sketch after this list).
  • Ensure security best practices are implemented and maintained across the data infrastructure and pipelines.
  • Troubleshoot and resolve issues related to data infrastructure, pipelines, and deployments in a timely manner.
  • Participate in code reviews for infrastructure code, dbt models, and automation scripts.
  • Document system architectures, configurations, and operational procedures.
  • Stay current with emerging DevOps technologies, data engineering tools, and cloud best practices, particularly around Snowflake, dbt, and Terraform.
  • Optimize data pipelines for performance, scalability, and cost.
  • Support and contribute to data governance and data quality initiatives from an operational perspective.
  • Help implement AI features.
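To give a flavor of the Python automation this role involves, here is a minimal sketch of a CI step that runs dbt against a Snowflake target and fails the pipeline on any model or test error. The project directory, target name, and file layout are hypothetical placeholders for illustration, not details taken from this posting.

```python
#!/usr/bin/env python3
"""Minimal CI step: run `dbt build` and fail fast on errors.

All names here (project path, target) are illustrative placeholders,
not details from the job posting.
"""

import subprocess
import sys


def run_dbt(project_dir: str = "dbt/snowflake_project", target: str = "ci") -> int:
    # `dbt build` compiles and runs models, then executes their tests;
    # a nonzero exit code means a model failed to build or a test failed.
    result = subprocess.run(
        ["dbt", "build", "--project-dir", project_dir, "--target", target],
        capture_output=True,
        text=True,
    )
    # Surface dbt's own logs so the CI output stays readable.
    print(result.stdout)
    if result.returncode != 0:
        print(result.stderr, file=sys.stderr)
    return result.returncode


if __name__ == "__main__":
    sys.exit(run_dbt())
```

Returning dbt's own exit code keeps the script transparent to the CI system: the pipeline stage passes or fails exactly as dbt did, with no extra status logic to maintain.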

Language Skills

  • English
Note for Users

This job listing comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.