This job posting is no longer available
About
Data Engineer
Location:
Dallas, TX / Plano, TX (Onsite)
Job Summary
We are seeking a Data Engineer with strong experience in designing, building, and maintaining scalable data pipelines and ETL frameworks. The ideal candidate has a solid background in data modeling, SQL, Python, and cloud data platforms (Azure, AWS, or GCP), and works well in Agile environments to deliver data solutions supporting analytics, BI, and machine learning initiatives.
Key Responsibilities
- Design, develop, and maintain data pipelines and ETL processes to support analytics and business applications.
- Build and optimize data models, ensuring efficient data flow and transformation.
- Develop and maintain scalable data warehouses / data lakes using cloud platforms (Azure, AWS, or GCP).
- Collaborate with data analysts, data scientists, and business teams to understand requirements and translate them into technical solutions.
- Implement data validation, quality checks, and governance standards.
- Monitor, troubleshoot, and optimize data systems for performance and scalability.
- Integrate data from multiple sources, including APIs, flat files, databases, and third-party systems.
- Ensure data security, integrity, and compliance with corporate standards.
- Work closely with DevOps teams to automate deployments using CI/CD tools.
Required Skills
- 5-8 years of experience as a Data Engineer or ETL Developer.
- Strong experience with SQL, Python, and data transformation logic.
- Proficiency with ETL tools (ADF, Informatica, Talend, or Apache Airflow).
- Experience with cloud platforms, preferably Azure Data Factory, Azure Synapse, Databricks, or AWS Glue / Redshift.
- Solid understanding of data warehousing concepts (Kimball, star schema, SCD).
- Hands-on experience with big data frameworks (Spark, Hadoop) and NoSQL databases (MongoDB, Cassandra).
- Familiarity with version control (Git) and CI/CD pipelines (Azure DevOps, Jenkins).
- Strong analytical, communication, and problem-solving skills.
Preferred Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- Experience with containerization (Docker, Kubernetes) and Infrastructure as Code (Terraform).
- Exposure to streaming data platforms such as Kafka or Kinesis.
- Experience working in Agile/Scrum environments.
- Certification in Azure Data Engineer Associate, AWS Data Analytics, or GCP Professional Data Engineer is a plus.
Soft Skills
- Strong attention to detail and documentation.
- Effective communication with technical and business stakeholders.
- Ability to manage multiple priorities in a fast-paced environment.
- Team player with a proactive and collaborative attitude.
Language Skills
- English
Note for users
This job posting was published by one of our partners. You can view the original posting here.