About
- AWS (S3, Glue, Lambda) and Databricks/Apache Spark for batch and real-time streaming.
- Deep expertise in the Snowflake Cloud Data Platform, including datamart development and SnowPro-level optimization.
- Lead the implementation of dbt (Core/Cloud) and Matillion for robust ETL/ELT workflows.
- Leverage AWS best practices to ensure pipelines are cost-effective, reliable, and high-performing.
- Develop and maintain version-controlled data workflows using Git and automated deployment pipelines.
- Understanding of data warehouse (DWH) systems and of migration from DWH to data lakes/Snowflake.
- Strong problem-solving skills and the ability to handle complex data challenges.
- Build processes supporting data transformation, data structures, metadata, dependency, and workload management.

Benefits

Significant career development opportunities exist as the company grows. The position offers a unique opportunity to be part of a small, fast-growing, challenging, and entrepreneurial environment, with a high degree of individual responsibility.

Tiger Analytics provides equal employment opportunities to applicants and employees without regard to race, color, religion, age, sex, sexual orientation, gender identity/expression, pregnancy, national origin, ancestry, marital status, protected veteran status, disability status, or any other basis as protected by federal, state, or local law.
Languages
- English