About
Position: GCP Data Engineer (Healthcare Background Required)
Location: Remote
Full-Time Only (US Citizens & Green Card holders)
Summary:
We are seeking a candidate with strong experience architecting enterprise data platforms on Google Cloud Platform (GCP). The architect will work as a strategic technical partner to design and build a GCP BigQuery-based Data Lake & Data Warehouse ecosystem.
The role requires deep hands-on expertise in data ingestion, transformation, modeling, enrichment, and governance, combined with a strong understanding of clinical healthcare data standards, interoperability, and cloud architecture best practices.
Key Responsibilities:
1. Data Lake & Data Platform Architecture (GCP)
- Architect and design an enterprise-grade GCP-based data lakehouse leveraging BigQuery, GCS, Dataproc, Dataflow, Pub/Sub, Cloud Composer, and BigQuery Omni.
- Define data ingestion, hydration, curation, processing, and enrichment strategies for large-scale structured, semi-structured, and unstructured datasets.
- Create data domain models, canonical models, and consumption-ready datasets for analytics, AI/ML, and operational data products.
- Design federated data layers and self-service data products for downstream consumers.
2. Data Ingestion & Pipelines
- Architect batch, near-real-time, and streaming ingestion pipelines using GCP Cloud Dataflow, Pub/Sub, and Dataproc.
- Set up data ingestion for clinical (EHR/EMR, LIS, RIS/PACS) datasets, including HL7, FHIR, CCD, and DICOM formats.
- Build ingestion pipelines for non-clinical systems (ERP, HR, payroll, supply chain, finance).
- Architect ingestion from medical devices, IoT, remote patient monitoring, and wearables leveraging IoMT patterns.
- Manage on-prem → cloud migration pipelines, hybrid cloud data movement, VPN/Interconnect connectivity, and data
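As a flavor of the clinical interoperability work above, here is a minimal sketch of extracting fields from a pipe-delimited HL7 v2 message. It uses only the Python standard library; the sample message and the field positions shown (MSH-9 message type, PID-3 patient identifier) follow HL7 v2 conventions, and the sample values are made up for illustration.

```python
# Hypothetical sketch: pull basic routing fields from a raw HL7 v2 message.
# HL7 v2 segments are carriage-return-delimited; fields within a segment use '|'.
raw = (
    "MSH|^~\\&|LAB|HOSP|EHR|HOSP|202401011200||ORU^R01|123|P|2.5\r"
    "PID|1||MRN001||DOE^JANE"
)

# Split the message into segments, then each segment into fields.
segments = [seg.split("|") for seg in raw.split("\r")]

# Locate the message header (MSH) and patient identification (PID) segments.
msh = next(s for s in segments if s[0] == "MSH")
pid = next(s for s in segments if s[0] == "PID")

# After splitting on '|', MSH-9 (message type) lands at index 8
# because MSH-1 is the field separator itself; PID-3 lands at index 3.
message_type = msh[8]
patient_id = pid[3]
```

In a real pipeline this kind of parsing would typically be delegated to a dedicated HL7 library or an integration engine rather than hand-rolled string splitting; the sketch only illustrates the message shape the role deals with.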
Languages
- English