Data Engineer - Snowflake & Activation Platforms
Frontier Credit Union
- Snowflake, Arizona, United States
About
Position Overview
The Data Engineer is responsible for designing, building, and operating the downstream data layer that powers analytics, Salesforce enablement, marketing activation, lending insights, automation, and AI initiatives at Frontier Credit Union. This role owns the transformation, modeling, validation, and operationalization of data after it has been ingested into Snowflake.
Upstream data sourcing, APIs, and ingestion pipelines are owned by a dedicated Software Engineering function. This role partners closely with that team to define data contracts and ensure ingested data is production‑ready for downstream use.
Activation platforms include Salesforce CRM, Salesforce Marketing Cloud, internal data applications, and AI‑enabled workflows. The focus of this role is ensuring data is trustworthy, well‑modeled, and usable by these systems.
System Ownership & Organizational Impact
Serves as the primary owner of Snowflake‑based transformations and downstream data products.
Translates raw ingested data into trusted, governed, analytics‑ready datasets.
Ensures downstream consumers (Salesforce, Marketing Cloud, BI tools, internal apps, and automation workflows) receive consistent and reliable data.
Identifies and resolves data quality issues arising post‑ingestion.
Reduces downstream friction by enforcing standards, documentation, and observability across data layers.
Essential Functions & Responsibilities
Snowflake Transformations & Data Modeling
Design, build, and maintain Snowflake schemas, tables, views, and transformation layers.
Implement scalable modeling patterns to support CRM, marketing, lending, and operational analytics.
Write and optimize complex SQL transformations with a focus on correctness, performance, and cost efficiency.
Maintain consistent metric definitions and reusable logic across business domains.
Downstream Data Pipelines & Orchestration
Build and maintain transformation pipelines that convert ingested data into curated, business‑ready datasets.
Implement dependency management, scheduling, and monitoring for downstream workflows.
Support incremental processing, backfills, and reprocessing as business needs evolve.
Partner with upstream engineers to define schema expectations, freshness SLAs, and data contracts.
Salesforce & Activation Enablement
Prepare and maintain datasets that support Salesforce CRM and Salesforce Marketing Cloud use cases.
Partner with Salesforce Administrators and Marketing teams to ensure downstream data supports segmentation, analytics, and reporting needs.
Support identity resolution, deduplication logic, and business rules at the data layer.
Ensure consistency between Snowflake‑curated data and downstream operational systems.
This role supports Salesforce through data products and pipelines, not platform administration or campaign execution.
Data Quality, Validation & Observability
Implement data quality checks, validation rules, and anomaly detection for critical datasets.
Monitor data freshness, volume, and schema stability for downstream tables.
Build visibility into data readiness for analytics, automation, and operational use.
Document data models, definitions, and ownership for key datasets.
Data Applications & Enablement Tooling
Build internal data tools and lightweight applications (e.g., Streamlit) to support operations, analytics, and decision‑making.
Develop tools for pipeline health monitoring, data readiness checks, and operational reporting.
Translate business questions into durable data products rather than one‑off analyses.
Automation & AI Enablement
Prepare feature‑ready datasets to support automation and AI initiatives.
Partner with analytics and AI stakeholders to operationalize models and analytical outputs.
Ensure downstream pipelines support reproducibility, auditability, and long‑term maintainability.
Operational Discipline & Change Management
Follow established development, testing, and deployment standards for data transformations.
Use version control and documentation standards to support maintainability.
Participate in incident response related to downstream data failures.
Continuously improve reliability, performance, and clarity of data products.
Project Collaboration & Communication
Work closely with Software Engineers responsible for data ingestion and APIs.
Collaborate with BI, Salesforce, Marketing, Lending, and Operations teams.
Participate in requirements discovery, design reviews, and prioritization discussions.
Communicate technical tradeoffs and constraints clearly to non‑technical stakeholders.
Requirements
Knowledge, Skills & Abilities
Technical Skills
Advanced SQL skills, including complex transformations, window functions, and performance tuning.
Strong Python experience for data pipelines, automation, and data tooling.
Hands‑on experience with Snowflake in production environments.
Experience building downstream data pipelines and curated data layers.
Familiarity with orchestration, scheduling, and transformation frameworks.
Experience supporting downstream consumers such as BI tools, CRM platforms, or internal applications.
Professional Skills
Strong ownership mindset for data products.
Ability to operate independently with clear accountability.
Strong problem‑solving and systems‑thinking skills.
Clear communication across technical and business audiences.
Comfort working with sensitive data in regulated environments.
Education & Experience Requirements
Required
Bachelor’s degree in Computer Science, Engineering, Information Systems, or equivalent professional experience.
3–6 years of experience in data engineering, analytics engineering, or a closely related role.
Demonstrated experience building Snowflake‑based transformations and data products.
Strong production experience with SQL and Python.
Preferred
Experience supporting Salesforce CRM or Salesforce Marketing Cloud as downstream consumers.
Experience building internal data tools or applications (Streamlit preferred).
Experience in financial services or other regulated industries.
Experience supporting automation or AI‑driven analytics workflows.
Languages
- English