
This job posting is no longer available.


US_East | Platform Engineer - DevOps Specialist _L2

Apolis
  • United States

About

Role name: Databricks Administrator
Work site: Seattle, WA (onsite)
Job Description: Databricks Administrator
Responsible for provisioning Databricks and Unity Catalog services in line with organizational policies and governance requirements, and for streamlining and automating data-related processes.
Skills and Experience (Hands-on)
Required
  • Account and workspace administration
  • Configuration of workspaces across environments
  • Management and optimization of compute resources
  • Strong knowledge of security and governance: user and group management, provisioning, identity federation
  • Role-based access control, cluster policies, and workspace object permissions
  • Unity Catalog administration (metastores, catalogs, schemas, external locations, and Delta Sharing)
  • Knowledge of security compliance procedures
  • Strong engineering knowledge to guide data engineering teams on building pipelines and optimizing existing ones
  • Usage monitoring and troubleshooting of bottlenecks in Spark jobs, ETL pipelines, and ML workloads
  • Scripting experience in Python and SQL
  • Data sharing knowledge and experience implementing guardrails
Preferred
  • Understanding of agentic architecture
  • Familiarity with data requirements of common ML/AI use cases
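The cluster-policy item above is usually expressed as a Databricks cluster policy definition. A minimal illustrative sketch follows; the attribute names and constraint types (`fixed`, `range`, `allowlist`) come from the documented policy-definition schema, while the specific node types, limits, and tag values are assumptions, not values from this posting:

```json
{
  "spark_version": { "type": "fixed", "value": "auto:latest-lts" },
  "autotermination_minutes": { "type": "range", "maxValue": 120, "defaultValue": 60 },
  "node_type_id": { "type": "allowlist", "values": ["i3.xlarge", "i3.2xlarge"] },
  "custom_tags.cost_center": { "type": "fixed", "value": "data-platform" }
}
```

A policy like this caps auto-termination, pins the runtime, and restricts instance types, so teams can self-serve clusters without violating governance limits.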
Responsibilities
  • Implement data provisioning patterns based on business requirements, following predefined processes, policies, standards, and metadata management rules
  • Create and manage distributed workspaces in Databricks, set up workspace policies, provision Databricks clusters, and manage data infrastructure sizing and capacity
  • Create Python notebooks, implement data masking processes, create UDFs (SQL/Python), and troubleshoot data pipelines
  • Ensure data security and compliance with regulations using Databricks and Privacera features
  • Navigate a multi-step enterprise approval process across architecture, security, and governance teams
  • Design and implement data architecture leveraging technologies such as Databricks, Unity Catalog, Privacera, and Collibra
  • Develop, optimize, and manage data pipelines for ETL processes using Databricks, with a focus on data integrity and quality
  • Design and maintain data models and schemas, incorporating Unity Catalog and Collibra data governance practices
  • Operationalize machine learning models in batch and real-time data pipelines, leveraging relevant governance setups
  • Collaborate with cross-functional teams including data scientists, engineers, and analysts to translate business requirements into scalable solutions
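The data-masking/UDF responsibility above can be sketched as a plain Python function of the kind that would back a UDF; this is a hypothetical example (the masking rule and function name are assumptions, not a requirement of the role):

```python
import re

def mask_email(value):
    """Mask the local part of an email address, keeping the domain.

    e.g. "alice@example.com" -> "a****@example.com"
    """
    if value is None:
        return None
    match = re.match(r"^([^@]+)@(.+)$", value)
    if not match:
        return "****"  # not a recognizable email; mask entirely
    local, domain = match.groups()
    return local[0] + "****@" + domain

# In a Databricks notebook this could be exposed to SQL via:
# spark.udf.register("mask_email", mask_email)
```

Registering the function with `spark.udf.register` makes it callable from SQL queries, so masking can be applied consistently in views governed by Unity Catalog.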

Language skills

  • English
Notice to users

This posting was published by one of our partners. You can view the original posting here.