About
Location: San Francisco, CA; Dallas, TX; Hopkins, MN; Charlotte, NC; or Atlanta, GA
Contract: 6-month contract-to-perm or 6-month contract
In office: 3 days a week onsite
Supporting Azure and AWS Databricks as an Analytics Platform Engineer from a DevOps perspective requires a hybrid skill set spanning cloud infrastructure management and deployment, automation, and CI/CD practices. A background in, or exposure to, ELT, data engineering, and database administration concepts is helpful.
As a member of the Analytics Platform Engineering team, the resource will be responsible for developing the Databricks foundation for use within U.S. Bank. They will build automation via Terraform modules or other tooling to deploy Databricks workspaces along with other cloud resources. The team is developing a platform on top of a PaaS offering, which also requires implementing a CI/CD solution in GitLab.
Core Responsibilities
Implement CI/CD Pipelines: Design and maintain continuous integration and continuous deployment pipelines using tools like GitLab, Azure DevOps, or GitHub Actions to deploy data pipelines and infrastructure changes.
Infrastructure as Code (IaC): Provision and manage Azure Analytics services (Databricks, etc.) through code using tools such as Terraform, ARM templates, or Bicep to ensure consistency and repeatability.
Platform Operations: Deploy, manage, and optimize the performance and resource utilization of Azure data services, including monitoring and troubleshooting data pipeline failures and performance issues.
Automation & Scripting: Automate routine operational tasks and application deployments using scripting languages like Python, PowerShell, or Bash.
Security & Compliance: Implement security best practices, including identity and access management (IAM), data encryption, and compliance with data governance policies (e.g., GDPR) within the platform.
Collaboration: Work closely with data engineers, data scientists, and business analysts to translate data requirements into robust technical solutions and foster a DevOps culture within the organization.
Key Skills And Qualifications
Cloud Platform Expertise: Deep knowledge of Microsoft Azure and/or AWS services, specifically Azure and/or AWS Databricks; exposure to Azure Synapse Analytics (SQL pools, Spark pools, pipelines), Azure Data Factory (pipelines, triggers, data flows), Azure Data Lake Storage (ADLS), and Azure Monitor.
DevOps Tools & Methodologies:
CI/CD platforms: Expertise in GitLab, Azure DevOps (Azure Pipelines, Boards, Repos), or GitHub Actions.
Version Control: Strong experience with Git and branching strategies.
IaC tools: Proficiency in Terraform, Bicep, or ARM templates.
Containerization: Experience with Docker and container orchestration (Azure Kubernetes Service - AKS) is highly beneficial.
Programming & Scripting Languages: Proficiency in Python, SQL (including query optimization), and scripting languages like PowerShell or Bash for automation and data manipulation tasks.
Data Engineering Fundamentals: Basic understanding of data engineering principles such as ETL/ELT processes, data modeling, data warehousing, and big data concepts.
Monitoring & Logging: Experience implementing monitoring, logging, and alerting solutions using tools like Azure Monitor, Log Analytics, and Application Insights.
Soft Skills: Excellent problem-solving, analytical, and communication skills to work effectively in fast-paced, agile environments.
Preferred Certifications: Microsoft Certified: Azure Data Engineer Associate (DP-203) or Microsoft Certified: Azure DevOps Engineer Expert (AZ-400) are highly valued.
Languages
- English