About
Key Responsibilities:
- Architect and manage modern data pipelines, warehouses, and lakehouses (Snowflake, Databricks, BigQuery).
- Design and maintain ETL workflows that ensure data accuracy, quality, and accessibility.
- Develop advanced analytics and machine learning models to uncover patterns and predict outcomes.
- Create dashboards and visualizations that enable real-time insights for leadership teams.
- Partner with business stakeholders to translate data outputs into measurable KPIs and ROI metrics.

Ideal Background:
- 5-10 years of experience in data engineering, analytics, or data science.
- Strong programming skills (Python, SQL, R), with experience in big data tools (Spark, Airflow) and BI platforms (Power BI, Tableau, Looker).
- Advanced degree in Computer Science, Statistics, or a related field preferred.
Languages
- English