
Senior Data Engineer

FreelanceJobs
  • CA
    Canada

About

A leading cloud services firm is seeking a highly skilled Data Engineer for a high-impact engagement to implement a Relationship-Based Access Control (ReBAC) Minimum Viable Product (MVP). You will work in a "divide and conquer" co-development model alongside engineering teams to transition a finalized technical design into a production-ready reality.
The engagement runs in a combined Cloudflare/GCP ecosystem.
The primary goal is to deliver a centralized access control mechanism scoped to a specific boundary system. This project moves the architecture from batch processing to an event-driven model, reducing data propagation latency from hours to near real-time. While the initial scope is specific, the solution must include "architectural stubbing" to support future integrations with broader ecosystem tools.
As the lead Data Engineer on this project, your work will be divided into four critical execution streams:
Source Ingestion Implementation:
  • Develop Ingestion Workers to process CRM platform events and internal employee data.
  • Implement HMAC-SHA256 signature validation for secure payload authentication.
  • Configure Queues to manage backpressure, with Dead Letter Queues (DLQs) for failed messages.
  • Build a Backfill Workflow using Bulk APIs to migrate historical relationship data via cloud storage.
Data Persistence Layer Construction:
  • Provision and manage sharded SQL-based databases to serve as the Relationship Store.
  • Develop a Shard Router Library in TypeScript to route data across shards.
  • Implement a caching strategy using Key-Value (KV) namespaces for "hot" authorization relationships to reduce database read load.
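The shard-routing and hot-cache ideas above can be sketched as follows. The shard count, key shape, hash choice, and the Map standing in for a KV namespace are all illustrative assumptions, not the project's finalized design:

```typescript
// Sketch: deterministic shard routing + read-through hot cache.
// SHARD_COUNT and the key format are assumptions for illustration.

const SHARD_COUNT = 8;

// FNV-1a: a simple, stable 32-bit string hash (no crypto needed here).
function fnv1a(input: string): number {
  let hash = 0x811c9dc5;
  for (let i = 0; i < input.length; i++) {
    hash ^= input.charCodeAt(i);
    hash = Math.imul(hash, 0x01000193) >>> 0;
  }
  return hash >>> 0;
}

// Same key always maps to the same shard.
export function shardFor(subjectKey: string): number {
  return fnv1a(subjectKey) % SHARD_COUNT;
}

// Read-through cache for "hot" relationships; a Map stands in for a
// KV namespace in this sketch.
const hotCache = new Map<string, string>();

export async function getRelationship(
  key: string,
  readFromShard: (shard: number, key: string) => Promise<string>,
): Promise<string> {
  const cached = hotCache.get(key);
  if (cached !== undefined) return cached; // cache hit: no database read
  const value = await readFromShard(shardFor(key), key);
  hotCache.set(key, value);
  return value;
}
```

A stable hash keeps routing stateless (any Worker instance computes the same shard), while the read-through cache absorbs repeated authorization checks for the same relationship.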
Infrastructure & Automation:
  • Define all resources (Workers, Databases, KV, Queues) in Terraform for reproducible environments.
  • Establish SQL migration mechanisms within the CI/CD pipeline to manage schema changes across shards.
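The Terraform scope above might look roughly like the fragment below. This is illustrative only: the resource names and IDs are invented for the sketch, and exact resource types and attributes depend on the Cloudflare provider version in use.

```hcl
# Sketch of the ReBAC MVP resource set (names are assumptions).

resource "cloudflare_workers_kv_namespace" "hot_relationships" {
  account_id = var.account_id
  title      = "rebac-hot-relationships"
}

resource "cloudflare_queue" "ingestion" {
  account_id = var.account_id
  name       = "rebac-ingestion"
}

resource "cloudflare_queue" "ingestion_dlq" {
  account_id = var.account_id
  name       = "rebac-ingestion-dlq"
}

resource "cloudflare_d1_database" "relationship_shard_0" {
  account_id = var.account_id
  name       = "rebac-relationships-shard-0"
}
```

In practice the shard databases would be generated with `count` or `for_each` rather than declared one by one, so adding a shard is a one-line change.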
Validation & Handover:
  • Execute load tests to verify an Ingestion Latency SLO of under 60 seconds.
  • Create operational runbooks for shard management and incident response.
Required Skillset
Core Technical Stack: GCP, Cloudflare DBs
  • Cloud Platform Workers: Extensive experience with serverless functions, sharded databases, and cloud-native queuing systems.
  • Languages: Proficiency in TypeScript (for worker development and routing libraries) and SQL.
  • Data Integration: Proven experience with Bulk APIs and event-driven architectures.
  • Infrastructure as Code: Advanced Terraform skills for full environment provisioning.
  • Security: Understanding of HMAC authentication, secrets management via vault services, and the Principle of Least Privilege (PoLP).
Performance & DevOps:
  • Testing: Experience with load-testing tools (e.g., Gatling) for performance benchmarking against latency targets.
  • CI/CD: Experience integrating build and deploy steps into unified CI/CD ecosystems.
  • Architecture: Familiarity with ReBAC/ABAC security models and policy-as-code.
Contract duration of 1 to 3 months.
Mandatory skills: Google Cloud Platform, Cloudflare, sharding, database schema, Database Caching, Database Architecture, Database Design, Data Migration, Data Transformation, Query Tuning

Languages

  • English
Notice for Users

This job comes from a TieTalent partner platform. Click "Apply Now" to submit your application directly on their site.