This job posting is no longer available.

Databricks Senior Engineer

Slalom
  • Michigan, North Dakota, United States

About

Who You'll Work With
Slalom is a fiercely human business and technology consulting company that leads with outcomes to bring more value, in all ways, always. From strategy through delivery, our agile teams across 52 offices in 12 countries collaborate with clients to bring powerful customer experiences, innovative ways of working, and new products and services to life. We are trusted by leaders across the Global 1000, many successful enterprise and mid-market companies, and 500+ public sector organizations to improve operations, drive growth, and create value.

At Slalom, we believe that together, we can move faster, dream bigger, and build better tomorrows for all.

What You'll Do

  • Act as a technical leader on delivery teams, working with minimal oversight and direction to deliver innovative solutions on the Databricks Platform using core cloud data lakehouse methodologies, tools, distributed processing engines, event streaming platforms, and other modern data technologies.
  • Be part of the Databricks Center of Excellence.
  • Build the next generation of data platforms and work with some of the most forward-thinking organizations in data and analytics.
  • Work under the direction of a Solution Architect to help design and implement components of our clients' data platform solution.
  • Participate in design sessions, break down complex development tasks, and complete development items on time.
  • Contribute to various COE initiatives to develop Databricks solution accelerators and bring an innovative mindset.

What You'll Bring
As a Senior Engineer in the Databricks practice, you will bring a curious mindset and a passion for exploring innovative solutions to address our clients' most pressing data challenges. You are a self-starter who excels at breaking down complex problems and eagerly shares insights with your team and the broader Builder community.

Key Responsibilities

  • Develop and implement data solutions using Databricks, with hands-on experience in specific Databricks platform features such as Delta Lake, UniForm (Iceberg), Delta Live Tables (Lakeflow Declarative Pipelines), and Unity Catalog (a minimal example follows this list).
  • Collaborate with cross-functional teams to design and optimize data pipelines for both batch and streaming data, ensuring data quality and efficiency.
  • Stay current with emerging technologies and the latest Databricks platform features, and continuously improve your skills in Databricks and other relevant tools.
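
For illustration only, here is a minimal sketch of the kind of Delta Lake work referenced above. It is not Slalom's implementation: it assumes a Databricks notebook where spark (a SparkSession with Delta Lake enabled) is already provided, and the catalog, schema, and table names are hypothetical.

    from pyspark.sql import functions as F

    # Batch: write a small DataFrame as a Unity Catalog managed Delta table
    # (the "main.demo" catalog/schema is a hypothetical example)
    orders = spark.createDataFrame(
        [(1, "widget", 9.99), (2, "gadget", 19.99)],
        ["order_id", "product", "amount"],
    )
    orders.write.format("delta").mode("overwrite").saveAsTable("main.demo.orders")

    # Read the table back and aggregate it, e.g. as a downstream reporting step
    revenue = (
        spark.table("main.demo.orders")
        .groupBy("product")
        .agg(F.sum("amount").alias("revenue"))
    )
    revenue.show()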

Requirements

  • 5+ years of data engineering experience, with at least 2 years of hands-on data pipeline design and development experience with Databricks, including specific platform features like Delta Lake, UniForm (Iceberg), Delta Live Tables (Lakeflow Declarative Pipelines), and Unity Catalog.
  • Proficiency in designing and building robust, scalable, YAML-configuration-driven data pipelines with batch, micro-batch, and streaming data ingestion and processing patterns using tools like Auto Loader (a short sketch follows this list), demonstrating a strong understanding of modern data engineering practices.
  • Experience building complex job/workflow orchestration patterns using Databricks Jobs/Workflows or other orchestration tools such as Airflow or dbt.
  • Experience building robust data quality and pipeline audit/observability solutions with Databricks-native features and/or other data quality tools and frameworks such as Great Expectations, Collibra, or dbt.
  • Proficiency in Big Data Platforms: Apache Spark, Presto, Amazon EMR
  • Experience with Cloud Data Warehouses: Amazon Redshift, Snowflake, Google BigQuery.
  • Strong programming skills using SQL, Stored Procedures, and Object-Oriented Programming languages (like Java, Python, PySpark, etc.).
  • Familiarity with building DevOps/CI/CD pipelines using Databricks Asset Bundles with automated validation and testing.
  • Exposure to Infrastructure as Code (IaC) tools like Terraform is a big plus.
  • Familiarity with NoSQL Databases and Container Management Systems.
  • Exposure to AI/ML tools (like MLflow), prompt engineering, and modern data and AI agentic workflows.
  • An ideal candidate will have completed the Databricks Data Engineering Associate and/or Professional certification and will have delivered multiple Databricks projects.
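
As a concrete illustration of the Auto Loader ingestion pattern mentioned above, the sketch below incrementally reads newly arrived JSON files and appends them to a bronze Delta table. It again assumes a Databricks notebook with spark provided; all paths, checkpoint locations, and table names are hypothetical.

    # Auto Loader: incrementally discover and ingest new files from cloud storage
    bronze = (
        spark.readStream
        .format("cloudFiles")                                 # Auto Loader source
        .option("cloudFiles.format", "json")                  # format of incoming files
        .option("cloudFiles.schemaLocation",
                "/Volumes/main/demo/checkpoints/orders_schema")
        .load("/Volumes/main/demo/landing/orders/")
    )

    # Micro-batch style: process whatever files are available now, then stop
    (
        bronze.writeStream
        .option("checkpointLocation", "/Volumes/main/demo/checkpoints/orders")
        .trigger(availableNow=True)
        .toTable("main.demo.orders_bronze")
    )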

Compensation And Benefits
Slalom prides itself on helping team members thrive in their work and life. As a result, Slalom is proud to invest in benefits that include meaningful time off and paid holidays, parental leave, 401(k) with a match, a range of choices for highly subsidized health, dental, & vision coverage, adoption and fertility assistance, and short/long-term disability. We also offer a yearly $350 reimbursement

Language Skills

  • English
Note for Users

This job posting was published by one of our partners. You can view the original posting here.