Senior Databricks Engineer
remoterocketship
- New York, New York, United States
About
- Design and build large-scale data platforms on Databricks (Delta Lake, Spark, Unity Catalog) in Azure
- Develop and maintain batch and streaming data pipelines for high-volume, complex data sources
- Implement medallion/lakehouse architectures from the ground up in greenfield environments
- Build and optimize data models to support analytics, reporting, and downstream applications
- Integrate Databricks with enterprise systems (APIs, event streams, warehouses, ML workflows)
- Tune Spark jobs and pipelines for performance, reliability, and cost at scale
- Support production deployments, including CI/CD pipelines, testing, and release management
- Partner directly with enterprise clients to translate requirements into working technical solutions
- Collaborate with architects, engineers, and data scientists across multiple workstreams
- Balance speed and quality, knowing when to move fast and when to harden solutions
- Make pragmatic decisions in ambiguous, evolving environments (especially greenfield builds)
- Contribute hands-on while also guiding design and approach across the team
- Communicate tradeoffs clearly to both technical and non-technical stakeholders
- Work within modern engineering practices (version control, code reviews, automated testing)
- Demonstrated ability to mentor and guide data engineers and analysts

Requirements:
- Deep Databricks-native expertise, including experience architecting and implementing end-to-end lakehouse solutions that run primarily or entirely on Databricks.
- Advanced experience with modern Databricks architecture patterns, including declarative pipelines / Delta Live Tables, Unity Catalog, Delta Lake, workflow orchestration, governance, performance tuning, and operational monitoring.
- Familiarity with infrastructure-as-code (Terraform, Bicep), environment provisioning, and CI/CD automation (GitHub, Azure DevOps) for Databricks-based platforms.
- Strong learning agility, technical curiosity, and comfort using AI-enabled development workflows or automation tools to accelerate delivery and improve quality.
- Familiarity with other modern cloud data architectures and tools, including cloud-native data warehouses (Snowflake, BigQuery, Redshift), data lakes, orchestration frameworks (Airflow/Astronomer), transformation tools (dbt), catalog/governance platforms, and scalable batch or streaming data processing services (Kafka, Kinesis).

Benefits:
Medical, dental, vision, 401k, holiday pay, vacation, personal and family sick leave, and more.
Language skills
- English
Notice to users
This job offer comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.