About
Location: Plano
Employment Type: FTC
Experience Level: Senior
Job Summary
We are seeking a highly skilled Data Engineer to design, build, and optimize data pipelines and infrastructure that support advanced analytics and business intelligence initiatives. The ideal candidate will have hands-on experience with Python, Snowflake, Databricks, and AWS services such as Glue, Step Functions, and CloudWatch, along with a strong understanding of data modeling, ETL frameworks, and cloud-based data solutions.
Key Responsibilities
- Design, develop, and maintain scalable data pipelines and ETL processes using Python, Databricks, and AWS Glue.
- Build and optimize data warehouses and data lakes on Snowflake and AWS platforms.
- Implement data ingestion, transformation, and orchestration workflows using AWS Step Functions, Lambda, and Glue jobs.
- Monitor, troubleshoot, and optimize pipeline performance using AWS CloudWatch and other monitoring tools.
- Collaborate with data analysts, data scientists, and business stakeholders to define data requirements and deliver reliable datasets.
- Ensure data quality, governance, and security across all data systems and workflows.
- Automate repetitive data engineering tasks and contribute to building reusable frameworks and templates.
- Support continuous improvement by adopting best practices for CI/CD, version control, and infrastructure-as-code (IaC).
Required Skills & Qualifications
- Bachelor's or Master's degree in Computer Science, Data Engineering, or a related field.
- 3-7 years of experience in data engineering or related roles.
- Strong programming skills in Python for ETL, data processing, and automation.
- Proven experience with Snowflake, including data modeling, query optimization, and performance tuning.
- Hands-on experience with Databricks (PySpark, Delta Lake) for big data processing and analytics.
- Experience with AWS cloud services, particularly:
  - AWS Glue (ETL development)
  - AWS Step Functions (workflow orchestration)
  - AWS CloudWatch (monitoring and logging)
  - Optional: S3, Lambda, Athena, Redshift, IAM
- Knowledge of SQL and experience writing complex queries and stored procedures.
- Understanding of data integration, data warehousing, and data lake architectures.
- Experience with version control (Git) and DevOps pipelines.
Preferred Qualifications
- Experience with Terraform or CloudFormation for infrastructure-as-code.
- Familiarity with Airflow or other orchestration tools.
- Exposure to real-time data streaming (Kafka, Kinesis).
- Strong problem-solving skills and the ability to work in a fast-paced, agile environment.
Diverse Lynx LLC is an Equal Employment Opportunity employer. All qualified applicants will receive due consideration for employment without any discrimination. All applicants will be evaluated solely on the basis of their ability, competence and their proven capability to perform the functions outlined in the corresponding role. We promote and support a diverse workforce across all levels in the company.
Language Requirements
- English
Note for Applicants
This job listing comes from a partner platform of TieTalent. Click "Apply Now" to submit your application directly on their website.