This job offer is no longer available
About
We are seeking an experienced Data Engineer to join a large, enterprise-scale environment within the financial services industry. This role is hands-on and focused on building, supporting, and enhancing data solutions that rely heavily on relational databases, AWS, and modern ETL processes. The ideal candidate is comfortable working close to the database layer, understands complex PL/SQL logic, and can partner effectively with distributed Agile teams to build and optimize data solutions that support analytics and applications. The role will focus on SQL and PL/SQL development, data modeling, and ETL/ELT processes across relational and NoSQL data stores. The engineer will collaborate within global Agile teams, leverage AWS services, and contribute to CI/CD automation to deliver reliable, scalable data pipelines.
This is a hybrid (part remote, part onsite) role in Westlake, TX.
Due to client requirements, applicants must be willing and able to work on a W2 basis.
For our W2 consultants, we offer a great benefits package that includes medical, dental, and vision benefits, a 401(k) with company matching, and life insurance.
Rate: $60.00 to $70.00/hr. W2
Responsibilities:
- Strong experience working with Oracle RDBMS, including PL/SQL, with the ability to read, understand, and debug complex database logic
- Design, develop, and support ETL/ELT processes, leveraging tools and frameworks such as AWS Glue and Python
- Hands-on experience with AWS services including EC2, S3, Lambda, SNS, and SQS
- Build and maintain CI/CD pipelines using tools such as Git, Maven, Jenkins, Docker, and related DevOps technologies
- Develop and support data integrations and pipelines in a production environment
- Validate, monitor, and troubleshoot data workflows across development, testing, and production
- Collaborate with global, cross-functional teams in an Agile delivery model
Experience Requirements:
- Demonstrated experience developing and debugging SQL and PL/SQL
- Demonstrated experience as a Data Engineer or in a similar role in an enterprise environment
- Strong background in data modeling, operational databases, and end-to-end data pipeline development
- Proven ability to work independently while contributing effectively within a team
- Hands-on experience with relational databases such as Oracle and PostgreSQL, and NoSQL databases such as DynamoDB and Elasticsearch
- Thorough understanding of data modeling and ETL/ELT concepts and tools
- Experience programming with Python, Java, and Unix
- Experience with DevOps or CI/CD pipelines using Git, Maven, Jenkins, and related tools
- Experience with AWS services including EC2, S3, Lambda, SNS, and SQS
- Ability to work effectively on global Agile teams with strong written and verbal communication
- Ability to validate, monitor, and troubleshoot issues in development, test, and production
- Knowledge of Kafka or similar streaming and messaging technologies such as Kinesis, SNS, or SQS (preferred)
Nice to Have:
- Experience building API integrations to databases using Java
- Exposure to streaming or messaging platforms such as Kafka (or similar technologies)
- Familiarity with NoSQL databases (e.g., DynamoDB, Elasticsearch)
- AWS certifications (Associate, Professional, or Specialty)
Education Requirements:
- Bachelor's or Master's degree in a technology-related field such as Computer Science or Computer Engineering
- AWS certification (Associate, Professional, or Specialty) (preferred)
Languages
- English