
This job posting is no longer available

Full-Stack + Data Engineer Needed for Custom Options Flow

FreelanceJobs
  • Canada

About

I'm an active options trader currently using multiple platforms (Market Chameleon, Unusual Whales, Tradytics, FlowAlgo, Barchart). I want to hire one senior engineer to build a custom web-based trading dashboard that consolidates the features I use most—especially:
Dark pool / off-exchange prints (as provided by a licensed data API)
Options flow including large trades / block / sweep tags (as available from provider)
Large order / sweep style alerts
Up-to-date news headlines
Stock + options screeners
Alerting engine (rules-based first; "AI-generated" output should focus on summaries, explanations, and trade context, not random buy/sell signals)
Key requirement: the system must be designed to integrate data from a single provider (preferred) or at most two providers via API (REST + WebSocket). You will build the app so I can manage watchlists, saved scanners, and alerts, and view everything in one place.
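As a point of reference for "everything in one place", here is a minimal sketch of the kind of normalized event record every provider feed could be mapped into; the field names and enum values are illustrative assumptions, not a required schema.

```python
# Sketch only: a normalized record that options flow, dark pool prints, and
# news items from any vendor could be mapped into before hitting the timeline.
# Field names and enum values are illustrative assumptions, not a spec.
from dataclasses import dataclass, field
from datetime import datetime, timezone
from enum import Enum
from typing import Any


class EventKind(str, Enum):
    OPTION_TRADE = "option_trade"      # block/sweep-tagged flow
    DARKPOOL_PRINT = "darkpool_print"  # off-exchange print
    NEWS = "news"                      # headline


@dataclass(frozen=True)
class TapeEvent:
    kind: EventKind
    symbol: str        # normalized ticker, e.g. "AAPL"
    ts: datetime       # always UTC, normalized across vendors
    source: str        # provider identifier, e.g. "vendor_a"
    payload: dict[str, Any] = field(default_factory=dict)  # raw vendor fields

    @classmethod
    def from_vendor(cls, kind: EventKind, symbol: str, ts: datetime,
                    source: str, payload: dict[str, Any]) -> "TapeEvent":
        # Normalize timestamps and symbols once, at the ingestion boundary,
        # so timeline filters and alert rules stay vendor-independent.
        if ts.tzinfo is None:
            ts = ts.replace(tzinfo=timezone.utc)
        return cls(kind, symbol.strip().upper(), ts.astimezone(timezone.utc),
                   source, payload)
```

One shared shape like this, whatever database ends up underneath, keeps the timeline, scanners, and alert rules independent of any single vendor.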
What You'll Build (Core Features)
MVP (Phase 1):
Real-time event ingestion (WebSocket) + backfill/reconciliation (REST)
Unified event timeline ("tape"): options trades + dark pool prints + news
Ticker workspace page: key recent events + filters + basic charts/tables
Saved filters / scanners (stock + options)
Alerts engine (a rough sketch follows this list):
user-defined rules (filters + thresholds)
dedupe/cooldown (avoid spam)
in-app notifications (and optionally Discord/Telegram/email integration)
Admin/config area: manage watchlists, saved views, alert rules
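To make the rules + dedupe/cooldown behaviour concrete, here is a minimal sketch; the rule fields, premium threshold, and cooldown value are placeholders chosen for illustration, not the required rule set.

```python
# Sketch only: a rules-based alert check with dedupe/cooldown.
# Rule fields and thresholds are illustrative placeholders.
import time
from dataclasses import dataclass, field


@dataclass
class AlertRule:
    name: str
    symbol: str                 # e.g. "AAPL"
    min_premium: float          # e.g. alert on option trades >= $250k premium
    cooldown_s: int = 300       # suppress repeat alerts for 5 minutes


@dataclass
class AlertEngine:
    rules: list[AlertRule]
    _last_fired: dict[tuple[str, str], float] = field(default_factory=dict)

    def check(self, event: dict) -> list[str]:
        """Return alert messages for an incoming (already normalized) event."""
        fired = []
        now = time.monotonic()
        for rule in self.rules:
            if event.get("symbol") != rule.symbol:
                continue
            if event.get("premium", 0.0) < rule.min_premium:
                continue
            key = (rule.name, rule.symbol)
            last = self._last_fired.get(key)
            # Dedupe/cooldown: skip if this rule already fired for this symbol recently.
            if last is not None and now - last < rule.cooldown_s:
                continue
            self._last_fired[key] = now
            fired.append(f"{rule.name}: {rule.symbol} premium "
                         f"${event['premium']:,.0f}")
        return fired


# Example usage
engine = AlertEngine(rules=[AlertRule("big_flow", "AAPL", min_premium=250_000)])
print(engine.check({"symbol": "AAPL", "premium": 400_000}))  # -> one alert
print(engine.check({"symbol": "AAPL", "premium": 500_000}))  # -> [] (cooldown active)
```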
Phase 2 (after MVP):
Deeper analytics (flow clustering, repeat prints, unusual activity scoring)
"AI" layer for summarizing why alerts fired + contextual news recap
Production deployment + monitoring
Data Sources / Integrations
You should be comfortable integrating paid market data APIs (REST + WebSockets), handling:
rate limits / retries
reconnect logic
timestamp/symbol normalization
vendor quirks and outages
keeping API keys secure (never in frontend)
I'm currently evaluating providers (examples: Unusual Whales API / Intrinio, and others if needed). You'll build this in a vendor-agnostic way so we can swap providers if required.
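As one reading of that vendor-agnostic requirement, here is a minimal sketch of a provider adapter interface plus reconnect/backoff handling; the `websockets` dependency, the example class, and the event shape are assumptions for illustration, and authentication is omitted because it differs per provider.

```python
# Sketch only: vendor-agnostic ingestion behind one adapter interface, with
# reconnect + exponential backoff. The websockets library is an assumed
# dependency; real provider SDKs/endpoints would replace the example class.
import asyncio
import json
import random
from typing import AsyncIterator, Protocol

import websockets                       # pip install websockets
from websockets.exceptions import WebSocketException


class MarketDataProvider(Protocol):
    """Contract the rest of the app depends on, so vendors can be swapped."""
    def stream_events(self) -> AsyncIterator[dict]: ...


class ExampleWsProvider:
    def __init__(self, url: str, api_key: str):
        self.url = url
        self.api_key = api_key          # stays server-side, never in frontend

    async def stream_events(self) -> AsyncIterator[dict]:
        backoff = 1.0
        while True:
            try:
                # Auth headers/params omitted here: they differ per provider.
                async with websockets.connect(self.url) as ws:
                    backoff = 1.0       # reset after a successful connect
                    async for raw in ws:
                        yield json.loads(raw)
            except (OSError, WebSocketException):
                # Reconnect logic: exponential backoff with jitter, capped.
                await asyncio.sleep(backoff + random.random())
                backoff = min(backoff * 2, 60.0)


async def ingest(provider: MarketDataProvider) -> None:
    async for event in provider.stream_events():
        ...  # normalize -> persist -> fan out to timeline and alert engine
```

A REST backfill/reconciliation method would sit behind the same interface, so swapping providers should only ever touch the adapter layer.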
Tech Preferences (Flexible if you justify choices)
Backend: Python (FastAPI) or (NestJS/Express) or Go
Frontend: React +
DB: Postgres (optionally TimescaleDB) or ClickHouse for event analytics
Cache/Queue: Redis preferred
Infra: Docker + docker-compose for local, deployable to a cloud VM/container later
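Purely as a sketch of how the preferred stack could push live updates to the React dashboard, here is a minimal FastAPI WebSocket endpoint; the endpoint path, payload shape, and in-process queue fan-out (rather than Redis pub/sub) are assumptions, not the intended architecture.

```python
# Sketch only: a FastAPI WebSocket endpoint pushing alert/tape updates to the
# frontend. Endpoint path and payload shape are illustrative assumptions.
import asyncio

from fastapi import FastAPI, WebSocket, WebSocketDisconnect

app = FastAPI()
subscribers: set[asyncio.Queue] = set()


async def publish(event: dict) -> None:
    """Called by the ingestion/alert pipeline to fan out an event."""
    for queue in subscribers:
        queue.put_nowait(event)


@app.websocket("/ws/tape")
async def tape_stream(ws: WebSocket) -> None:
    await ws.accept()
    queue: asyncio.Queue = asyncio.Queue()
    subscribers.add(queue)
    try:
        while True:
            event = await queue.get()
            await ws.send_json(event)
    except (WebSocketDisconnect, RuntimeError):
        pass  # client went away; sending on a closed socket raises
    finally:
        subscribers.discard(queue)
```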
Required Qualifications (Must Have)
Proven experience building real-time data applications (WebSockets, event streams)
Strong backend + API design skills (auth, caching, pagination, resilience)
Strong database/time-series/event modeling experience
Strong frontend dashboard experience (tables, filters, state, real-time updates)
Security best practices (secrets management, no leaked keys)
Excellent communication + documentation habits
Nice to Have
Market data or trading tools experience (options basics, "dark pool" prints conceptually)
Experience with TimescaleDB/ClickHouse/Kafka/NATS
Observability (metrics/logging/tracing)
Experience implementing LLM features responsibly (summaries/explanations with sources)
Deliverables
Working application (MVP) with documented setup
Source code repo + clean commit history
Architecture diagram + data model
Tests for critical components (ingestion, alerts)
Deployment instructions (local via Docker; cloud-ready plan)
Contract duration: less than 1 month, at 30 hours per week.
Mandatory skills: Python, JavaScript, API, PHP, API Integration, Database Design, Data Integration, Data Preprocessing, Database Architecture

Language skills

  • English
Note for users

This job posting was published by one of our partners. You can view the original posting here.