Senior Data Platform / Analytics Engineering Focus
WebstaurantStore
- United States
About
Role Overview
As a Senior Data Platform Engineer, you will architect and own the next generation of our enterprise data integration and ingestion platform. You will lead the creation of a unified ingestion framework into Snowflake; build scalable, repeatable, cloud‑native data pipelines; guide the transition from SQL Server to Snowflake; mentor engineers; establish engineering best practices; and collaborate with stakeholders to deliver reliable, production‑grade cloud data solutions.
Responsibilities
Define and drive the architecture of a standardized cloud data ingestion framework into Snowflake.
Design scalable, secure, and reusable ingestion patterns for diverse sources (APIs, databases, cloud storage, flat files, SharePoint).
Build and maintain Microsoft Fabric Pipelines to support enterprise‑scale orchestration and workflow automation.
Develop Python‑based ingestion utilities, automation scripts, and reusable components.
Establish best practices for ingestion performance, observability, resiliency, error handling, logging, and monitoring.
Analyze existing ingestion processes and lead modernization, consolidation, and optimization initiatives.
Guide the organization through the transition from SQL Server to Snowflake.
Serve as the technical authority for data integration and ingestion architecture, mentoring and coaching BI developers.
Conduct code reviews, provide architectural guidance, and ensure adherence to data engineering standards and DevOps principles.
Collaborate with BI Managers to align resources, timelines, and priorities.
Promote a culture of knowledge sharing, continuous improvement, and engineering excellence.
Partner with analysts and business stakeholders to understand ingestion requirements and translate them into scalable cloud data solutions.
Lead the planning, design, and analysis of ingestion workflows and data engineering projects.
Evaluate new data sources and determine optimal ingestion strategies.
Contribute to the long‑term data platform roadmap and cloud architecture strategy.
Ensure solutions meet enterprise standards for data quality, governance, and security.
Physical Requirements
Work is performed while sitting/standing and interfacing with a personal computer.
Requires the ability to communicate effectively using speech, vision, and hearing.
Requires regular use of hands for simple grasping and fine manipulation.
Requires occasional bending, squatting, crawling, climbing, and reaching.
Requires the ability to occasionally lift, carry, push, or pull medium weights up to 50 lbs.
Remote Work Qualifications
Reliable, secure high‑speed internet connection (at least 75 Mbps download / 10 Mbps upload).
Access to a home router and modem.
Dedicated home office space that is noise and distraction free.
Physical address required (no PO Boxes).
Ability and desire to work and communicate with team members via chat, webcam, etc.
Legal resident of one of the listed states.
W2 employment only; H‑1B visa sponsorship not available.
Experience
Bachelor’s degree in Computer Science, Information Systems, Engineering, or a related field; or equivalent professional experience.
7–10 years of professional data engineering experience.
5+ years with Snowflake (ingestion, transformation, optimization, platform performance tuning).
4–6 years building data pipelines using Microsoft Fabric Pipelines or equivalent.
4–6 years professional Python experience focused on automation, ingestion utilities, and framework development.
5+ years of enterprise SQL experience, including advanced SQL Server and change data capture (CDC) implementation.
5+ years of Agile/Scrum experience.
3+ years leading data engineering initiatives or serving as a technical lead on ingestion projects.
Experience with event‑driven architecture, streaming ingestion, or data contract–based integration models.
Experience using Qlik Replicate, dbt Cloud, and Microsoft Azure.
Certifications in dbt Cloud, Snowflake, or Microsoft Fabric are a plus.
Desired Traits & Skills
Deep expertise in building and maintaining cloud‑based data integration pipelines.
Strong proficiency with Snowflake architecture and ingestion patterns.
Advanced Python skills for automation and framework development.
Hands‑on experience with Microsoft Fabric Pipelines.
Strong knowledge of data modeling, warehousing, and ETL/ELT best practices.
Leadership ability to mentor and influence engineering direction.
Excellent analytical and problem‑solving skills.
Strong communication skills for translating complex technical concepts.
Ability to manage multiple priorities in a fast‑paced environment.
High adaptability, emotional intelligence, and collaborative mindset.
Customer‑focused approach with commitment to high‑quality solutions.
Company Overview
WebstaurantStore is the foodservice professional’s premier online source for restaurant equipment, supplies, and knowledge. Our purpose is to empower and equip people to run their businesses more profitably and efficiently.
Benefits
Medical
Vision
Dental
PTO
Paid Maternity Leave
Paid Parental Leave
Life Insurance
Disability
Dependent Care FSA
401(k) matching
Employee Assistance Program
Wellness Incentives
Company Discounts
AT&T & Verizon Discount
Bonus Opportunities
Accident Insurance
Critical Illness Insurance
Adoption Assistance
On‑Site Amenities
On‑site fitness centers
Dog‑friendly offices
Language Skills
- English
Notice to Users
This listing comes from a TieTalent partner platform. Click “Apply now” to submit your application directly on their site.