About
We are seeking a mid-level to senior DevOps/Platform Engineer to join our team on a contract basis to help design, build, and operate a secure, scalable enterprise Data Private Cloud (DPC) platform. This is a hybrid role that blends container-platform development (Kubernetes/OpenShift and data services) with security operations (SecOps) and automation. You will build and enhance platform services and workflows across the SDLC, implement security controls and compliance guardrails, and partner with cross-functional teams to operationalize secure-by-default data services at scale.
Required Skills & Qualifications
- 5+ years of specialty software engineering experience, or equivalent demonstrated through one or a combination of the following: work or consulting experience, training, military experience, or education.
- Strong Python development skills for enterprise-scale automation and service development.
- Solid understanding of security fundamentals (least privilege, defense-in-depth, secure SDLC) and common compliance concepts.
- Experience building or operating software in containerized environments (Kubernetes or OpenShift/OCP).
- Practical experience with CI/CD pipelines and integrating security checks/controls into delivery workflows.
- Strong communication skills and the ability to work effectively across engineering and security stakeholders.
- Prior work experience at the client or in the client's industry.
- Applicants must be able to work directly for Artech on a W2 basis.
Preferred Skills & Qualifications
- Experience with Apache/open-source ecosystem tools such as Ranger, Keycloak, Spark, Iceberg, and DataHub.
- Knowledge of S3-compatible object storage and large-scale distributed data processing patterns.
- Familiarity with observability tooling (logs/metrics/traces), security telemetry, and operational health dashboards.
- Experience with incident response, post-incident reviews, and improving operational resilience.
- Exposure to API design and/or UI development (e.g., React.js) for operational portals or admin tools.
Day-to-Day Responsibilities
- Design and build automated platform workflows for provisioning, deployment, and operational support of data services running on OpenShift/Kubernetes.
- Develop and maintain platform capabilities supporting data ecosystem components such as Spark, Iceberg, Ranger, Sparkflow, Superset, and related services.
- Contribute to resilient, scalable architecture for containerized workloads and large-scale data processing pipelines.
- Improve platform reliability through automation, runbooks, SRE practices, and standard operating procedures.
- Engineer security automation that enforces controls for data access, encryption, masking, and protection across the data platform.
- Integrate security into the SDLC by embedding controls into CI/CD pipelines, infrastructure as code, and release processes.
- Support security monitoring and compliance by contributing to policy management, attestation evidence, and continuous compliance workflows.
- Collaborate closely with architects, DevOps/platform engineers, and data product teams to deliver end-to-end solutions.
Company Benefits & Culture
- Automated workflows that make data services easy to deploy and operate on OCP/Kubernetes.
- Security controls that are built in, not bolted on: policy enforcement, least privilege, auditability, and compliance automation.
- Improved reliability and reduced operational overhead through standardization and automation.
- Strong cross-team alignment between platform engineering, data teams, and security stakeholders.
For immediate consideration, please click APPLY to begin the screening process with Alex.
Languages
- English
Notice for Users
This job comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.