About
IT & Technology
Job Requisition:
492479
Address:
USA-SC-Mauldin-208 Bi‑Lo Boulevard
Store Code:
Architecture, Strategy & Development (5119314)
Ahold Delhaize USA, a division of global food retailer Ahold Delhaize, is part of the U.S. family of brands, which includes five leading omnichannel grocery brands—Food Lion, Giant Food, The GIANT Company, Hannaford and Stop & Shop. Our associates support the brands with a wide range of services, including Finance, Legal, Sustainability, Commercial, Digital and E‑commerce, Technology and more.
Primary Purpose
Designs and implements reusable data patterns, automates quality and governance checks, and strengthens pipeline reliability and observability.
Our flexible/hybrid work schedule includes 3 in‑person days at one of our core locations and 2 remote days. Our core office locations include Salisbury, NC, Chicago, IL, and Quincy, MA.
Applicants must be currently authorized to work in the United States on a full‑time basis.
Duties & Responsibilities
Design modular ingestion and transformation components for common use cases, emphasizing reuse and maintainability.
Participate in developing streaming data applications (Kafka), data transformations, and data pipelines.
Build automated data tests, quality rules, and lineage capture into pipelines; contribute to CI/CD workflows that validate and deploy changes safely.
Operate monitoring, logging, and alerting; tune signals to reduce noise and speed triage (including observability tooling such as DataDog where used).
Execute upgrades and changes using low‑risk deployment practices, contributing automation scripts and repeatable release steps.
Support cloud service integration and container/orchestration patterns as needed to improve scalability and runtime performance.
Partner with consumers to align data models with user needs and usability principles; provide reusable datasets and components.
Maintain infrastructure as code and update environment configurations using approved tools and templates.
Document standards, runbooks, and troubleshooting guides; contribute to internal enablement materials for platform usage.
Contribute to Agile planning and continuous improvement; support operational needs that keep platform services running.
May be called upon to support critical escalations and must be available during urgent IT incidents as needed.
Qualifications
Bachelor's degree or equivalent years of work experience.
5+ years in data engineering or closely related roles.
Experience with Kafka and Databricks, or comparable technologies.
Hands-on experience with orchestration, distributed processing, and storage.
Proficiency in scripting/automation and version control.
Clear problem solving and communication skills.
Preferred Qualifications
Exposure to streaming and change data capture patterns.
Awareness of data privacy and access control practices.
Hands-on experience with Databricks Declarative Pipelines and metadata-driven architecture.
Salary Range: $125,040 - $187,560
Actual compensation offered to a candidate may vary based on their unique qualifications and experience, internal equity, and market conditions. Final compensation decisions will be made in accordance with company policies and applicable laws.
Languages
- English
Note for Users
This job posting comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their website.