About
This is a 3-month contract opportunity at 25 hours per week. Responsibilities:
Pipeline Architecture: Design and implement declarative data pipelines using Lakeflow and Databricks Asset Bundles (DABs) to ensure seamless CI/CD.
Data Ingestion: Build efficient, scalable ingestion patterns using Auto Loader and Change Data Capture (CDC) to handle high-volume data streams (a brief illustrative sketch follows this list).
Governance & Security: Manage metadata, lineage, and access control through
Unity Catalog
.
Orchestration: Develop and maintain complex workflows using Databricks Jobs and orchestration tools.
Infrastructure as Code: Utilize Terraform to manage AWS resources (S3, EC2) and Databricks workspaces.
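For context, here is a minimal sketch of the kind of Auto Loader ingestion pattern the role describes. All bucket paths and table names are hypothetical, and `spark` is assumed to be the SparkSession provided by the Databricks runtime:

```python
# Minimal Auto Loader ingestion sketch. Paths, bucket names, and table
# names are hypothetical; `spark` is provided by the Databricks runtime.
from pyspark.sql import functions as F

raw = (
    spark.readStream.format("cloudFiles")  # "cloudFiles" = Auto Loader source
    .option("cloudFiles.format", "json")   # format of the incoming files
    .option("cloudFiles.schemaLocation", "s3://example-bucket/_schemas/orders")
    .load("s3://example-bucket/landing/orders/")  # hypothetical landing zone
)

(
    raw.withColumn("ingested_at", F.current_timestamp())  # ingestion audit column
    .writeStream
    .option("checkpointLocation", "s3://example-bucket/_checkpoints/orders")
    .trigger(availableNow=True)     # drain the current backlog incrementally, then stop
    .toTable("main.bronze.orders")  # three-level Unity Catalog table name
)
```

CDC-style upserts would typically be layered on top of this append-only bronze stream, either with a Delta MERGE inside a foreachBatch handler or declaratively via Lakeflow's APPLY CHANGES INTO.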
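Similarly, day-to-day Unity Catalog access control largely reduces to SQL GRANT statements (often codified in Terraform). A small illustrative example, with hypothetical catalog, schema, table, and group names:

```python
# Illustrative Unity Catalog grants; catalog, schema, table, and group
# names are hypothetical. Run from a Databricks context where `spark` exists.
spark.sql("GRANT USE CATALOG ON CATALOG main TO `data_engineers`")
spark.sql("GRANT USE SCHEMA ON SCHEMA main.bronze TO `data_engineers`")
spark.sql("GRANT SELECT ON TABLE main.bronze.orders TO `analysts`")
```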
Qualifications:
Expertise: Deep mastery of PySpark and advanced SQL.
Platform: Extensive experience in the Databricks environment (Workflows, Delta Lake).
Cloud: Familiarity with AWS infrastructure and cloud-native data patterns.
Arctiq is an equal opportunity employer. If you need any accommodations or adjustments throughout the interview process and beyond, please let us know. We celebrate our inclusive work environment and welcome members of all backgrounds and perspectives to apply.
We thank you for your interest in joining the Arctiq team! While we welcome all applicants, only those who are selected for an interview will be contacted.
Languages
- English