About
This is not a build from scratch. We have a live, working real-time meeting analytics engine that integrates with Zoom and surfaces behavioral insights during meetings and in post-meeting reports. The core engine is production-grade, tested, and 85% complete. We need a senior developer to implement the final two feature modules, run end-to-end QA, and close the MVP.
What the engine does today (already built and working)
The platform connects to live Zoom meetings and processes audio, video, and transcript streams in real time. It runs facial emotion analysis, voice prosody analysis (arousal, valence, tension), and NLP-based behavioral detection simultaneously. During a meeting, the system surfaces live alerts and behavioral nudges to participants. After the meeting, it generates comprehensive reports with signal timelines, flashpoint detection, and exportable data (PDF, CSV, JSON).
The working codebase includes:
• 709 files, approximately 93,500 lines of source code across Python and TypeScript
• 91 test files plus golden fixture validation harnesses
• Real-time WebSocket streaming with Redis pub/sub
• 12 behavioral bias detection rules already firing in production
• Live nudge engine with 6 rules, cognitive overload detection, and flashpoint detection
• Post-meeting report pipeline with PDF/CSV/JSON exports
• Full database schema (24 PostgreSQL tables) with psychometric snapshot persistence
• Docker-compose local dev environment that spins up in minutes
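The golden fixture harnesses mentioned above compare pipeline output against curated expected results. As a hedged illustration only (the field names, tolerance, and fixture schema here are assumptions, not the project's actual format), a comparison helper might look like:

```python
def diff_against_golden(actual: dict, golden: dict, tolerance: float = 0.05) -> list[str]:
    """Compare pipeline output against a golden fixture dict.

    Categorical fields must match exactly; float scores are allowed a small
    tolerance. Returns a list of human-readable failure messages (empty list
    means the fixture passes). Schema and tolerance are illustrative.
    """
    failures = []
    for key, expected in golden.items():
        got = actual.get(key)
        if isinstance(expected, float):
            # Numeric scores (e.g. confidence) pass within a tolerance band.
            if got is None or abs(got - expected) > tolerance:
                failures.append(f"{key}: expected ~{expected}, got {got}")
        elif got != expected:
            # Everything else (bias labels, severity ratings) must match exactly.
            failures.append(f"{key}: expected {expected!r}, got {got!r}")
    return failures
```

In a pytest suite, each of the curated fixture segments would feed one parametrized case asserting `diff_against_golden(...) == []`.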
What we need you to build (the last 15%)
There are exactly two feature modules to implement and integrate into the existing engine. Both have detailed specs, acceptance criteria, golden fixtures, and bid packages already written. This is defined work, not open-ended discovery.
Feature 1: Cognitive Bias Surfacing (MindScope)
A two-stage LLM cascade that detects 12 specific cognitive biases in meeting transcripts in real time. Stage 1 runs a fast screening pass across all 12 biases per transcript segment. Stage 2 adjudicates the top candidates with evidence, confidence scoring, and severity ratings. Events at or above the 0.70 confidence threshold trigger live nudges; events scoring between 0.55 and 0.70 are recorded for the post-meeting report. The module includes a replay adapter for offline testing against 24 curated fixture segments with pass/fail checks.
Partial code exists on a feature branch (8 Python files, approximately 777 lines). Your job is to merge it to main, complete any gaps against the acceptance criteria, wire it into the live and post-meeting pipelines, and pass the fixture suite.
Estimated scope: 30 to 60 hours.
Tech stack (you must know all of these)
• Backend API: Python 3.11, Flask, SQLAlchemy, pytest
• Real-time service: TypeScript, WebSocket, Redis pub/sub
• Frontend: React 18, TypeScript, hooks-based architecture
• Data: PostgreSQL, Redis
• Infrastructure: Docker, docker-compose, EC2
• AI integrations: Anthropic Claude API (NLP), Hume AI (audio/video), Zoom RTMS (media streams)
Do not apply if
• You have not taken over an existing production codebase and shipped features on a deadline
• You cannot read and work within a 93K-line monorepo without wanting to rewrite it
• You have no experience with real-time WebSocket/streaming architectures
• You are unfamiliar with LLM API integration (prompt engineering, structured JSON outputs, retry/timeout handling)
• You cannot overlap US Central or Eastern working hours for daily coordination
• You need more than 5 business days to start
Engagement structure
Hourly with weekly cap (25 to 30 hours per week). Milestone-driven. Total estimated scope: 45 to 75 hours across both features plus QA. Timeline target: 2 to 3 weeks. NDA required before repo and documentation access.
What you get from us on day one
• Complete architecture documentation with service diagrams
• Detailed spec packages for both features with acceptance criteria, golden fixtures, and schemas
• A production codebase audit (current as of today) identifying every open item
• Docker-compose environment that runs the full stack locally
• Direct access to the technical product owner for daily questions
To apply: answer these three questions
1. Describe a project where you took over a production codebase mid-sprint and shipped features. What was the LOC, stack, and timeline?
2. You inherit a feature branch with 777 lines of Python implementing an LLM cascade pipeline. It was never merged to main. The main branch has diverged by 51 commits. Walk me through your first 60 minutes.
3. Your hourly rate, availability (start date and hours per week), and timezone.
Generic proposals will be ignored. If your response does not answer all three questions with specifics, we will not reply.
Contract duration of 1 to 3 months, with 30 hours per week.
Mandatory skills: React, Backend API, Python 3.11, SQLAlchemy, pytest, Flask, Amazon EC2
Languages
- English