Databricks Engineer
- Irving, Texas, United States
About
Job Title: Databricks Engineer
Eligibility: Green Card holders / U.S. Citizens only
Location: Onsite – Irving, TX
Rate: $60/hr (C2C/W2)
Job Overview
We are seeking a highly skilled Databricks Engineer with strong experience in Databricks development, administration, and support. The ideal candidate will have a solid background in data engineering, PySpark, and Delta Lake, along with hands-on experience managing Databricks on Google Cloud Platform (GCP). This role combines engineering, operational support, and platform optimization responsibilities to ensure efficient and scalable data solutions.
Key Responsibilities
Databricks Administration
- Install, configure, and manage Databricks workspaces in GCP environments.
- Monitor platform performance, resource utilization, and costs to ensure optimal efficiency.
- Implement and maintain governance, security, logging, and auditing frameworks.
- Manage user roles, cluster policies, and workspace configurations for multi-team environments.
Development & Enhancements
- Design, build, and optimize ETL/ELT pipelines using PySpark, SQL, and Delta Lake.
- Collaborate with data architects, analysts, and business stakeholders to deliver scalable data models.
- Migrate and modernize legacy data pipelines into Databricks-native architectures.
- Develop reusable data frameworks, notebooks, and modules following coding best practices.
Operations, Support & Troubleshooting
- Provide L2/L3 Databricks support, resolving performance issues, job failures, and cluster errors.
- Diagnose root causes for production incidents and implement permanent fixes.
- Manage job scheduling, task orchestration, and integration with tools like Airflow or Dataproc.
- Maintain detailed technical documentation for development, operations, and platform standards.
Required Skills & Experience
- 8+ years of total data engineering experience, with 3+ years dedicated to Databricks.
- Strong proficiency in PySpark, SQL, and Delta Lake for large-scale data processing.
- Experience with GCP Dataproc and cloud-native data services.
- Proven ability to administer, monitor, and optimize Databricks clusters and jobs.
- Familiarity with data governance, security, and CI/CD for data pipelines.
- Strong analytical and problem-solving skills with attention to detail.
- Excellent communication and documentation skills.
Preferred Qualifications
- Experience integrating Databricks with GCS, BigQuery, Airflow, or Terraform.
- Knowledge of data warehousing concepts, data modeling, and data lakehouse architectures.
- Certifications in Databricks, GCP, or related technologies are a plus.
Job Type: Contract
Pay: $60.00 per hour
Work Location: In person
Language Requirements
- English
This job posting comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.