
Enterprise AI-Ready Data Architect / Senior Data Engineer

Artech
  • United States

About

Job Title: Enterprise AI-Ready Data Architect / Senior Data Engineer
Location: East Hanover, NJ (hybrid: 3 days onsite, 2 days remote per week)
Duration: 6+ months
Pay Rate: $53.57–$64.28/hour
Job Description: The Enterprise AI-Ready Data Architect / Senior Data Engineer is a hybrid role focused on enterprise data architecture, AI integration, and hands-on data engineering. You will design and implement AI-ready, analytics-ready data products and semantic layers (including ontologies) that enable scalable enterprise analytics and integration with AI agents and GenAI use cases. You will embed governance-by-design (quality, lineage, contracts, observability) and partner closely with business and technology stakeholders in pharmaceutical domains.
Key Responsibilities

1) Enterprise Data Architecture (AI-Ready by Design)
  • Define and deliver strategic enterprise data architectures that scale and support AI-ready outcomes.
  • Design data workflows capturing as-is and to-be states for enterprise modernization.
  • Establish architecture patterns for:
    • Semantic context layers
    • Data warehouses and data lakehouses
    • Data catalogs and data marketplaces
    • Event-driven and metadata-driven architectures
    • Distributed data management (Data Mesh, Data Fabric, Domain-Driven Design)
    • Streaming data management
2) Data Products, Semantic Products, and Master Data
  • Design data products that are AI-ready and reusable across domains and use cases.
  • Build and govern semantic models, metrics-first modeling, and ontologies (knowledge graph concepts).
  • Deliver Master Data Management (MDM) capabilities and align master/reference data with business needs.
  • Support structured and unstructured data management to enable broader AI and analytics capabilities.
3) AI Integration and GenAI Enablement
  • Enable contextual intelligence and data enrichment using:
    • Contextual retrieval, entity linking, and enrichment with LLMs and embeddings
    • Vector search, RAG pipelines, and LLM-based enrichment
  • Implement graph-based approaches:
    • RDF, OWL, and SPARQL querying
    • Property graph / knowledge graph modeling for relationships and reasoning
4) Data Engineering Delivery
  • Design and implement robust ETL/ELT pipelines and orchestration frameworks.
  • Develop high-quality transformations and data models using:
    • Advanced SQL
    • Tools such as dbt, Airflow, and Dataiku
  • Ensure production-grade engineering practices for performance, reliability, and maintainability across pipelines.
5) Governance and Standards (Embedded)
  • Implement open-source data standards across:
    • Data contracts
    • Data quality
    • Data lineage
  • Lead metadata-driven governance through metadata management, observability, and policy-aligned design.
Skills and Qualifications

Core Technical Skills
  • Advanced SQL proficiency
  • Data platforms and governance tooling (one or more): Snowflake, Databricks, Collibra, Salesforce
  • ELT/ETL and orchestration: dbt, Airflow, Dataiku
  • BI and reporting: Power BI
  • Cloud platforms: AWS, Azure, GCP
  • Modern architecture and data management: Data Mesh, Data Fabric, streaming, metadata-driven architecture
  • Graph and semantic technologies: knowledge graphs, property graphs (Neo4j), RDF/OWL, SPARQL, graph query languages
Domain and Modeling Expertise
  • Data modeling techniques: conceptual, logical, and physical modeling, preferably for the pharmaceutical industry
  • Semantic modeling, ontology design, and reusable metric layers
  • MDM concepts and implementation approaches
AI and GenAI Enablement Skills
  • Familiarity with GenAI technologies for enhancing analysis, reporting, and data enrichment
  • Experience with embeddings, vector search, RAG patterns, and entity resolution/linking concepts
Nice to Have
  • Experience with the Palantir platform
Recommended Certifications
  • CDMP (DAMA)
  • TOGAF
  • EDM Council frameworks: DCAM, CDMC, Open Knowledge Graph, Data Ethics and Responsible AI
Qualifications
  • 10+ years of experience in data architecture, process automation, implementation, and large-scale data engineering, ideally in the pharmaceutical industry
  • Advanced hands-on engineering experience, including data modeling for OLAP, workflow automation, and AI/ML integration
  • ETL pipeline design and development
  • Bachelor's degree in computer science, information technology, engineering, or data science
  • Strong problem-solving skills and attention to detail
  • Excellent communication skills, with the ability to work with senior stakeholders to translate business requirements into technical data requirements

Languages

  • English
Notice for Users

This job comes from a TieTalent partner platform; applications are submitted directly on the partner's site.