About
The selected candidate will be required to work on several different APIs and data set publications, and must be able to develop using an Agile methodology.

Responsibilities:
- Backend API development and maintenance
- Data storage and ETL engineering
- Working in an AWS cloud-based environment
- Unit test writing
- Working with a Scrum team using Agile
- Writing documentation in Confluence and using JIRA for user stories
- Performance testing
- Participating in all team meetings

Requirements:
- Bachelor's degree
- 5+ years of professional software development experience
- Candidate must be able to obtain and maintain a Federal Public Trust
- Candidate must reside in the U.S., be authorized to work in the U.S., and all work must be performed in the U.S.
- Candidate must have lived in the U.S. for three (3) full years out of the last five (5) years
- TypeScript / JavaScript — backend services, async patterns, Node.js runtime
- Python 3 — data engineering, ETL pipelines, type hints, abstract base classes
- SQL — analytical queries, schema design, query optimization across PostgreSQL and MySQL
- NestJS framework — modules, controllers, services, dependency injection, guards, middleware, decorators
- RESTful API design — resource modeling, HTTP semantics, versioning
- ETL pipeline design — extract → transform → validate → publish lifecycle; idempotency patterns; runtime business rule validation
- S3 — file storage, S3A filesystem integration with Spark, lifecycle conventions
- Docker — multi-stage Dockerfiles, docker-compose for local dev clusters, environment parity with production runtimes
- ORM proficiency — TypeORM (entity modeling, migrations, query builder, transactions)
- Authentication & authorization — JWT/Bearer tokens and policy-based authz with role/claim evaluation
- Apache Spark (PySpark) — distributed compute, DataFrame I/O, Spark SQL, EMR Serverless job configuration and submission
- Pandas / NumPy — in-process data transformation, vectorized operations, statistical aggregations
- Vitest and Jest — unit and integration testing, high coverage discipline (95%+ thresholds)
- pytest — Python unit and integration testing; mocking AWS services
- Structured logging — contextual request/job logging
- APM tooling — Datadog familiarity a plus
- Dependency security — Snyk, CVE remediation, automated dependency updates (Dependabot)
- PostgreSQL — schema design, JDBC integration, query optimization
- MySQL / Aurora MySQL — schema design, indexing, migrations
- Amazon Redshift — analytical SQL, serverless cluster connectivity, credential management
- AWS CodeBuild — CI/CD pipeline authoring, multi-step buildspecs, secret injection
- EMR Serverless — PySpark job submission, monitoring, custom Python
- SSM Parameter Store — runtime secret and config injection
- TypeScript linting — ESLint 9, TypeScript ESLint, Prettier, Husky + lint-staged pre-commit hooks
- Python linting — Ruff (lint + format), isort, pip-compile for deterministic dependency pinning
- Federal Government contracting work experience
- Prior experience in consulting or healthcare highly preferred

Benefits:
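The ETL lifecycle named above (extract → transform → validate → publish, with idempotency) can be sketched in a few lines. This is a minimal, hypothetical illustration of those patterns, not any specific pipeline from the role; all function and variable names are invented for the example, and idempotency is approximated here by keying each published batch on a content hash so re-runs of the same input are no-ops.

```python
import hashlib
import json


def extract(records):
    """Pull raw rows from a source (stubbed as an in-memory list)."""
    return list(records)


def transform(rows):
    """Normalize field names and types."""
    return [{"id": int(r["id"]), "name": r["name"].strip().lower()} for r in rows]


def validate(rows):
    """Enforce runtime business rules before publishing."""
    for r in rows:
        if r["id"] <= 0:
            raise ValueError(f"invalid id: {r['id']}")
    return rows


def publish(rows, store):
    """Idempotent publish: key the batch by a content hash, so republishing
    identical input leaves the store unchanged."""
    key = hashlib.sha256(json.dumps(rows, sort_keys=True).encode()).hexdigest()
    store.setdefault(key, rows)  # no-op if this batch was already published
    return key


def run_pipeline(source, store):
    return publish(validate(transform(extract(source))), store)
```

Running the pipeline twice on the same input returns the same key and leaves exactly one published batch in the store, which is the idempotency property the listing refers to.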
Reasonable Accommodations are available, including, but not limited to, for disabled veterans, individuals with disabilities, and individuals with sincerely held religious beliefs, in all phases of the application and employment process.
Languages
- English