About
We are seeking a Senior Data Engineer to support a Fortune 100 financial services client’s real-time intelligence initiatives. In this role, you’ll work in a virtualization-first architecture using Denodo, building both virtual and physical data products that are governed, high-quality, and AI-ready. You will play a critical role in establishing trusted, reusable data assets, each backed by formal data contracts, quality controls, and full lineage.

What You’ll Do
- Build governed virtual data models in Denodo, including cross-system joins, canonical schemas, ABAC policies, and column-level masking
- Configure and optimize semantic-layer connectivity (JDBC/ODBC, connection pooling, failover)
- Implement data quality rules (completeness, validity, uniqueness, anomaly detection, scoring)
- Configure the data catalog, lineage, and marketplace publishing
- Build physical data products: source extraction (mainframe, CDC, COBOL copybooks) and lakehouse storage using Apache Iceberg on AWS S3
- Define and implement machine-readable data contracts (YAML/JSON), including schema guarantees, SLAs, and dependency tracking
- Develop APIs, SLA monitoring dashboards, reconciliation processes, and production-grade delivery patterns
- Partner with client teams through demos, rotations, and structured knowledge transfer
- Leverage AI-assisted development (Claude Code) to accelerate delivery and improve productivity

Required Qualifications
- 5+ years of data engineering experience in enterprise-scale environments
- Strong expertise with Denodo (VQL development, query optimization, caching, governance)
- Experience with Apache Iceberg on AWS (S3, Glue, Athena), including schema evolution and partitioning
- Hands-on experience with Informatica
- Experience implementing data quality frameworks
- Strong SQL and programming skills (Python, Spark; dbt preferred)
- Experience with CDC technologies (e.g., Debezium, Informatica CDC, or equivalent)
- Familiarity with API development (FastAPI, Node.js) and OpenAPI standards
- Experience with AI-assisted development tools (Claude Code preferred; training available)

Preferred Qualifications
- Experience with mainframe data environments (COBOL copybooks, VSAM, AIX/AS400 extraction)
- Financial services or insurance domain experience
- Familiarity with multi-tier data certification frameworks (e.g., Bronze/Silver/Gold or Foundation/Production/Enterprise)
- Experience with API gateways and AI integration patterns (e.g., Kong Gateway, APIGEE, MCP)
- Experience designing and enforcing formal data contracts and dependency management
- Consulting or client-facing experience with structured knowledge transfer
- AWS certifications (Solutions Architect, Data Analytics) are a plus

About BDIPlus
BDIPlus is a data and AI consulting firm with deep experience delivering for Fortune 500 clients, including American Express and Morgan Stanley. We specialize in enterprise data products, identity resolution, semantic layers, and event-driven architecture. Our engineers are AI-native, leveraging tools like Claude Code from day one to deliver faster, higher-quality solutions than traditional consulting teams.

BDIPlus is an equal opportunity employer. We celebrate diversity and are committed to creating an inclusive environment for all employees.
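One of the responsibilities above is defining machine-readable data contracts (YAML/JSON) with schema guarantees, SLAs, and dependency tracking. As a minimal, hypothetical sketch of what enforcing such a contract can look like in Python, all field names, sections, and thresholds below are illustrative assumptions, not a BDIPlus or client specification:

```python
# A hypothetical data contract as it might appear after parsing a YAML/JSON
# file. The section and column names are illustrative only.
CONTRACT = {
    "product": "customer_profile_v1",
    "schema": {
        "customer_id": {"type": "string", "nullable": False},
        "risk_score": {"type": "double", "nullable": True},
    },
    "sla": {"freshness_minutes": 15, "availability_pct": 99.9},
    "dependencies": ["mainframe.policy_master", "iceberg.claims_fact"],
}

REQUIRED_SECTIONS = ("product", "schema", "sla", "dependencies")


def validate_contract(contract: dict) -> list[str]:
    """Return a list of violations; an empty list means the contract is well-formed."""
    errors = [f"missing section: {s}" for s in REQUIRED_SECTIONS if s not in contract]
    for col, spec in contract.get("schema", {}).items():
        if "type" not in spec:
            errors.append(f"column {col!r} has no declared type")
    return errors


def check_record(contract: dict, record: dict) -> list[str]:
    """Check one record against the schema guarantees (completeness, nullability)."""
    errors = []
    for col, spec in contract["schema"].items():
        if col not in record:
            errors.append(f"missing column: {col}")
        elif record[col] is None and not spec.get("nullable", True):
            errors.append(f"null in non-nullable column: {col}")
    return errors


if __name__ == "__main__":
    print(validate_contract(CONTRACT))
    print(check_record(CONTRACT, {"customer_id": None, "risk_score": 0.4}))
```

In practice such checks would run in the delivery pipeline, feeding the quality-scoring and SLA dashboards the role describes; this sketch only shows the contract-as-data idea.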
Language Skills
- English
Note for Users
This job listing comes from a TieTalent partner platform. Click “Apply Now” to submit your application directly on their website.