
Sr. Data Engineer

Tavant
  • United States

About

Job Description
The Sr. Data Engineer will contribute to the Company's success by partnering with business, analytics, and infrastructure teams to design and build data pipelines that support the measurement of key KPIs and metrics. Collaborating across disciplines, they will identify internal and external data sources, design table structures, define ETL strategy, and implement automated data quality checks.
Basic Qualifications:
  • 5+ years of relevant data engineering experience
  • Strong understanding of data modeling principles, including dimensional modeling and data normalization
  • Good understanding of SQL engines and the ability to conduct advanced performance tuning
  • Ability to think strategically and to analyze and interpret market and consumer information
  • Strong communication skills, both written and verbal
  • Excellent conceptual and analytical reasoning competencies
  • Comfortable working in a fast-paced, highly collaborative environment
  • Familiarity with Agile Scrum principles and ceremonies

Preferred Qualifications:
  • Bachelor's or master's degree in Computer Science, Engineering, or a related field
  • Strong analytical and problem-solving skills
  • Effective communication and teamwork abilities

Role
Must have skills:
  • Data engineering skills using:
  • Complex or advanced SQL queries
  • Python
  • Spark
  • Snowflake
  • Databricks
  • Airflow or Prefect
  • Experience with at least one cloud platform (AWS / Azure / GCP)

Nice to have skills:
  • Familiarity with tools like Datorama / Improvado / FiveTran for integrating, harmonizing, and visualizing data across platforms
  • Data modeling (e.g., dimensional modeling)
  • Experience working with Kafka
  • Familiarity with CI/CD tools (e.g., Jenkins, GitHub Actions) and Docker containers
  • Exposure to monitoring tools like Datadog

Responsibilities:
  • Partner with technical and non-technical colleagues to understand data and reporting requirements
  • Work with engineering teams to collect required data from internal and external systems
  • Design table structures and create data pipelines that deliver performant, reliable, and scalable data solutions in a fast-growing data ecosystem
  • Develop automated data quality checks
  • Develop and maintain ETL routines using ETL and orchestration tools such as Airflow
  • Perform ad hoc analysis as necessary
  • Perform SQL and ETL tuning as necessary
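To illustrate the "automated data quality checks" responsibility above, here is a minimal sketch in plain Python. The table, column names, and thresholds are hypothetical; in practice such checks would typically run inside an Airflow or Prefect task against Snowflake or Databricks, but the core idea is the same.

```python
# Minimal sketch of automated data quality checks on a batch of rows.
# Column names ("order_id") and sample data are hypothetical.

def check_not_null(rows, column):
    """Return indices of rows where `column` is missing or empty."""
    return [i for i, row in enumerate(rows) if row.get(column) in (None, "")]

def check_unique(rows, column):
    """Return values of `column` that appear more than once."""
    seen, dupes = set(), set()
    for row in rows:
        value = row.get(column)
        if value is not None and value in seen:
            dupes.add(value)
        seen.add(value)
    return sorted(dupes)

def run_quality_checks(rows):
    """Run all checks; return a dict of failures (empty dict means clean)."""
    failures = {}
    null_rows = check_not_null(rows, "order_id")
    if null_rows:
        failures["order_id_not_null"] = null_rows
    dupes = check_unique(rows, "order_id")
    if dupes:
        failures["order_id_unique"] = dupes
    return failures

orders = [
    {"order_id": "A1", "amount": 10.0},
    {"order_id": "A1", "amount": 12.5},   # duplicate key
    {"order_id": None, "amount": 7.0},    # missing key
]
print(run_quality_checks(orders))
```

An orchestrator such as Airflow would wrap `run_quality_checks` in a task and fail the pipeline run (or alert) when the returned dict is non-empty.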

Languages

  • English
Notice for Users

This job comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.