Senior Data Engineer | remoterocketship | Remote, Oregon, United States

This job posting is no longer available


Senior Data Engineer

remoterocketship
  • US
    Remote, Oregon, United States

About

Job Description:
  • Design, develop, and maintain a scalable lakehouse architecture, including a medallion (bronze/silver/gold) data model optimized for analytics and AI/ML consumption.
  • Design, implement, and operate ELT pipelines, including workflow orchestration, scheduling, and monitoring, to ensure reliable and scalable execution.
  • Establish data quality, testing, and observability practices, and proactively monitor and resolve data and automation issues to ensure platform reliability and trust.
  • Ensure data security and compliance, including role-based access controls, encryption, masking, and governance best practices for compliant handling of sensitive information.
  • Optimize the performance of data workflows and storage for cost efficiency and speed.
  • Partner with engineers, analysts, and stakeholders to meet data needs; balance cost, performance, simplicity, and time-to-value while mentoring teams and documenting standards.
  • Provide technical leadership and mentorship to team members, guiding best practices, skill development, and cross-functional collaboration.
  • Enable AI/ML use cases through well-structured data models, feature availability, and platform integrations using tools such as Databricks Vector Search and Model Serving.
  • Develop and maintain data pipelines using version control and CI/CD best practices in a collaborative engineering environment.
  • Collaborate within an Agile-Scrum framework and develop comprehensive technical design documentation to ensure efficient and successful delivery.
  • Serve as a trusted expert on organizational data domains, processes, and best practices.

Requirements:
  • 5+ years of hands-on data engineering experience (required)
  • 3+ years of experience building and operating data pipelines on a modern lakehouse platform (e.g., Databricks: Unity Catalog, Delta Live Tables, Asset Bundles), including data modeling, governance, and CI/CD deployment patterns (required)
  • 3+ years of experience with analytical SQL (ANSI SQL/T-SQL/Spark SQL) and Python for data engineering, including pipeline construction, transformation logic, and automation (required)
  • Strong communication skills, with the ability to collaborate and influence across engineering, analytics, and business stakeholders (required)
  • Experience with streaming and ingestion tools such as Kafka, Kinesis, Event Hubs, Debezium, or Fivetran (preferred)
  • Knowledge of DAX, LookML, dbt; Airflow/Dagster/Prefect; Terraform; Azure DevOps; Power BI/Looker/Tableau; GitHub Copilot (a plus)
  • Bachelor's degree in Computer Science, Information Technology, or a related field; Master's degree preferred

Benefits:
  • Preference given to East Coast candidates.

Language skills

  • English
Note for users

This job posting was published by one of our partners. You can view the original posting here.