About
- Design and lead end-to-end data architecture solutions using Databricks
- Own and execute data migration strategies into Databricks from legacy or cloud data platforms
- Architect scalable Lakehouse solutions using Delta Lake, Spark, and cloud-native services
- Define best practices for data modeling, ingestion, processing, and governance
- Collaborate with data engineers, analytics teams, and stakeholders to translate business requirements into technical designs
- Optimize the performance, reliability, and cost of Databricks workloads
- Ensure data security, compliance, and access controls within Databricks environments
- Review and guide implementation through design reviews and technical governance
Required Skills & Experience
- Several years of experience as a Data Architect or Databricks Architect
- Strong hands-on experience with Databricks (Spark, Delta Lake, Lakehouse architecture)
- Proven track record of migrating data platforms into Databricks
- Solid data engineering background, including experience on platforms that predate Databricks adoption
- Deep understanding of distributed systems and big data architectures
- Experience with at least one major cloud platform (AWS, Azure, or GCP)
- Strong expertise in ETL/ELT pipelines, data modeling, and performance tuning
- Excellent problem-solving and stakeholder communication skills
Good to Have
- Databricks certifications (Architect, Data Engineer, or Professional level)
- Experience with governance tools (Unity Catalog, data lineage, metadata management)
- Exposure to streaming frameworks and real-time data processing
Languages
- English
Notice for Users
This job comes from a TieTalent partner platform; applications are submitted directly on the partner's site.