RQ09815 - Software Developer - Senior
Maarut Inc
- Toronto, Ontario, Canada
About
Responsibilities:
- Design, build, and maintain secure, scalable Java services and APIs using Spring Boot.
- Translate technical requirements into production-grade application code, integration logic, and robust data access layers.
- Write clean, testable Java (unit, integration, regression), contribute to CI/CD pipelines, and support automated deployments.
- Design, build, and optimize data workflows – including SQL queries, ETL logic, and caching for reliability, integrity, and performance in production.
- Collaborate with data engineers and analysts to ensure service-layer alignment with enterprise data models and reporting needs.
- Diagnose and resolve production issues (performance, defects, incidents); participate in on-call / support rotations as needed.
- Review code, enforce engineering standards, document solutions, and mentor intermediate developers.
- Collaborate with architects, QA, product owners, and business SMEs in an iterative / Agile delivery model to plan, scope, and land increments.
- Apply AI/ML capabilities (LLMs, retrieval-augmented generation, classic ML models) to enhance existing Java services where appropriate.
- Design and consume AI-backed services (e.g., classification, summarization, recommendations, reasoning assistants) through secure REST integrations (see the illustrative sketch after this list).
- Support model lifecycle activities such as monitoring output quality, drift awareness, and safe, auditable usage of AI features.
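For illustration only, a minimal sketch of the kind of Spring Boot endpoint described above: a REST controller that validates input at the boundary and delegates to an AI-backed summarization service. The controller, the SummarizationClient interface, the path, and the record types are hypothetical placeholders, not part of this role's actual codebase.

```java
// Illustrative sketch only; all names below are hypothetical.
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.RestController;

@RestController
@RequestMapping("/api/v1/summaries")
public class SummaryController {

    // Hypothetical wrapper around a secure REST call to an AI summarization service.
    private final SummarizationClient summarizationClient;

    public SummaryController(SummarizationClient summarizationClient) {
        this.summarizationClient = summarizationClient;
    }

    @PostMapping
    public ResponseEntity<SummaryResponse> summarize(@RequestBody SummaryRequest request) {
        // Validate at the boundary, delegate to the AI-backed service,
        // and return a typed response rather than raw model output.
        if (request.text() == null || request.text().isBlank()) {
            return ResponseEntity.badRequest().build();
        }
        return ResponseEntity.ok(new SummaryResponse(summarizationClient.summarize(request.text())));
    }

    public record SummaryRequest(String text) {}
    public record SummaryResponse(String summary) {}

    // Hypothetical interface; a real implementation would call the AI platform over HTTPS.
    public interface SummarizationClient {
        String summarize(String text);
    }
}
```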
General Skills:
- Strong Java and Spring Boot experience building enterprise services at scale (API design, dependency management, error handling, observability, performance tuning).
- Advanced SQL fluency (Oracle, MySQL, PostgreSQL): complex joins, window functions, data validation, and query optimization (a query sketch follows this list).
- Working knowledge of data modeling, ETL/ELT pipelines, and API-driven data integration.
- Hands-on experience with Git, automated testing, secure coding practices, code reviews, and CI/CD pipelines.
- Experience deploying containerized services (Docker) to managed platforms or Kubernetes; comfort with production-grade runtime concerns (logging, metrics, alerts).
- Ability to integrate third-party / platform services and expose them through hardened APIs.
- Familiarity with responsible use of AI services in production: PII handling, privacy controls, auditability, bias/safety considerations.
- Ability to translate business needs into technical designs and incremental deliverables; strong troubleshooting and communication skills.
- Asset: exposure to AI/ML development workflows (Python, data prep, prompt design, vector search, etc.); ability to partner with data/AI specialists and embed their outputs in Java services.
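As a hedged example of the SQL fluency noted above, the sketch below uses Spring's JdbcTemplate to run a window-function query (latest case note per client). The table and column names are invented for illustration.

```java
// Illustrative sketch only; table and column names are hypothetical.
import java.util.List;
import java.util.Map;
import org.springframework.jdbc.core.JdbcTemplate;

public class LatestCaseNoteQuery {

    private final JdbcTemplate jdbcTemplate;

    public LatestCaseNoteQuery(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    /**
     * Returns the most recent case note per client using a window function,
     * the kind of window-function query referenced in the skills list above.
     */
    public List<Map<String, Object>> latestNotePerClient() {
        String sql = """
                SELECT client_id, note_id, created_at
                FROM (
                    SELECT client_id,
                           note_id,
                           created_at,
                           ROW_NUMBER() OVER (PARTITION BY client_id ORDER BY created_at DESC) AS rn
                    FROM case_notes
                ) ranked
                WHERE rn = 1
                """;
        return jdbcTemplate.queryForList(sql);
    }
}
```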
Desirable Skills:
- Integration of AI assistants / copilots / LLM features (for example: routing a user request from a Java service to Azure OpenAI, Copilot, Bedrock, etc.).
- Retrieval-augmented generation patterns (prompt construction, grounding with vector stores such as FAISS, pgvector, Azure AI Search); see the sketch after this list.
- Experience with analytics and data visualization tools (Power BI, Looker, or Tableau) to surface operational and model KPIs.
- Understanding of data governance and quality frameworks (metadata management, lineage, audit trails).
- Experience in case management / benefits administration domains (for example, Curam or similar social services platforms).
- Experience with secure handling of sensitive client data (privacy, masking, role-based access, audit trails).
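Purely as a sketch of the retrieval-augmented generation pattern mentioned above: fetch the top-k grounding passages for a question from a vector store (hidden behind a hypothetical PassageRetriever interface, which could be backed by pgvector or Azure AI Search) and assemble a grounded prompt. All names here are illustrative assumptions, not a prescribed implementation.

```java
// Illustrative sketch only; the retriever and its backing vector store are hypothetical.
import java.util.List;
import java.util.stream.Collectors;

public class GroundedPromptBuilder {

    /** Hypothetical retriever, e.g. backed by pgvector or Azure AI Search. */
    public interface PassageRetriever {
        List<String> topPassages(String query, int k);
    }

    private final PassageRetriever retriever;

    public GroundedPromptBuilder(PassageRetriever retriever) {
        this.retriever = retriever;
    }

    /**
     * Builds a retrieval-augmented prompt: fetch the top-k grounding passages
     * for the user question and instruct the model to answer only from them.
     */
    public String buildPrompt(String userQuestion) {
        String context = retriever.topPassages(userQuestion, 5).stream()
                .collect(Collectors.joining("\n---\n"));
        return """
                Answer the question using only the context below.
                If the answer is not in the context, say you do not know.

                Context:
                %s

                Question: %s
                """.formatted(context, userQuestion);
    }
}
```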
Requirements
Experience and Skill Set Requirements:
Must-haves:
- 7+ years hands-on Java development in an enterprise environment, including Spring Boot, REST API design, integration patterns, and production support / incident management.
- Strong SQL and data handling expertise: capable of analyzing schemas, building optimized queries, integrating APIs with data stores, and enforcing data quality in service logic.
- Proven experience supporting applications in production: triaging defects, analyzing incident root cause, applying hotfixes, improving resiliency and performance.
- Ability to consume and operationalize AI services: call LLM endpoints, handle prompt/response patterns, enforce guardrails, and log usage safely (see the sketch after this list).
- Practical understanding of core ML / LLM concepts (supervised vs unsupervised learning, prompt engineering, retrieval, drift) sufficient to collaborate with data/AI teams and ship AI-enabled features.
- Comfort working in a secure, governed environment (privacy, PII protection, access control, auditability).
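A minimal, hedged sketch of what consuming an AI service from Java can look like: an HTTP call to an LLM endpoint with timeouts, a simple PII-masking guardrail, and logging of usage metadata only. The endpoint URL, header names, and JSON shape are hypothetical placeholders; a real integration would follow the specific provider's API (Azure OpenAI, Bedrock, etc.).

```java
// Illustrative sketch only; the endpoint, headers, and JSON shape are hypothetical placeholders.
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;
import java.util.logging.Logger;

public class LlmClient {

    private static final Logger LOG = Logger.getLogger(LlmClient.class.getName());

    private final HttpClient http = HttpClient.newBuilder()
            .connectTimeout(Duration.ofSeconds(5))
            .build();
    private final String endpoint; // hypothetical gateway in front of the AI platform
    private final String apiKey;   // injected from a secret store, never hard-coded

    public LlmClient(String endpoint, String apiKey) {
        this.endpoint = endpoint;
        this.apiKey = apiKey;
    }

    public String complete(String prompt) throws Exception {
        // Guardrail example: mask anything that looks like a SIN/SSN before it leaves the service.
        String safePrompt = prompt.replaceAll("\\b\\d{3}[- ]?\\d{3}[- ]?\\d{3}\\b", "[REDACTED]");

        String body = "{\"prompt\": " + jsonString(safePrompt) + "}";
        HttpRequest request = HttpRequest.newBuilder(URI.create(endpoint))
                .timeout(Duration.ofSeconds(30))
                .header("Content-Type", "application/json")
                .header("Authorization", "Bearer " + apiKey)
                .POST(HttpRequest.BodyPublishers.ofString(body))
                .build();

        HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());

        // Log usage metadata only (status, sizes), never the prompt or completion text.
        LOG.info("LLM call: status=" + response.statusCode()
                + " promptChars=" + safePrompt.length()
                + " responseChars=" + response.body().length());
        return response.body();
    }

    private static String jsonString(String s) {
        // Minimal escaping for illustration; a real service would use a JSON library.
        return "\"" + s.replace("\\", "\\\\").replace("\"", "\\\"") + "\"";
    }
}
```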
Skill Set Requirements:
Technical Expertise:
- Enterprise Java delivery: 7+ years building secure, scalable services and APIs using Java and Spring Boot in a production environment.
- SQL, data access & integration: Strong experience working with relational databases (Oracle, SQL Server, MySQL), including complex joins, query optimization, data integrity enforcement, and schema-driven design. Ability to collaborate with data teams on modeling, ETL, and API-based data integration.
- Data engineering collaboration: Practical understanding of data pipelines, transformations, and validation workflows that support service reliability and analytics. Experience building data-driven logic in applications (e.g., caching, persistence, aggregation, or event-driven updates).
- Production support & incident management: Proven track record diagnosing and resolving application issues across dev/test/prod; strong root-cause analysis, defect remediation, hotfix coordination, and performance tuning using logs, metrics, and APM tools.
- API design & integration: Design, build, and consume REST services; manage authentication, secrets, and payload validation; integrate with internal and external systems including data and AI services.
- CI/CD & engineering discipline: Hands-on experience with Git, automated testing, code reviews, and build/deploy pipelines; containerization using Docker. Experience deploying to managed runtime platforms or Kubernetes is an asset.
- Secure development: Ability to build services with proper access control, auditing, error handling, and resiliency; familiarity with privacy, data governance, and PII protection requirements.
- Documentation & standards: Produces clear technical documentation, follows architectural guidance, and contributes to shared patterns and reusable components.
- AI/ML platform integration (20% of role): Ability to safely call AI services (e.g., Azure OpenAI, Bedrock, Copilot) from Java applications, handle prompt/response patterns, and apply guardrails for safety, privacy, and auditability.
- Foundational AI skills: Familiarity with modern LLM and retrieval-augmented generation patterns (prompt construction, retrieval via vector stores such as FAISS, pgvector, or Azure AI Search, tool/function calling, basic fine-tuning/LoRA).
- Data handling for AI features: Ability to work with structured and unstructured data, perform quality checks, and manage feature-ready datasets that power AI-driven functionality (a minimal validation sketch follows).
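A minimal sketch of the kind of data-quality gate described in the item above, assuming a hypothetical CaseRecord shape: reject records with missing keys or out-of-range values before they feed an AI-driven feature, and report the reject rate.

```java
// Illustrative sketch only; the record shape and checks are hypothetical.
import java.util.List;

public class FeatureDataValidator {

    /** Hypothetical input record for an AI-driven feature. */
    public record CaseRecord(String caseId, String category, Double benefitAmount) {}

    public record ValidationResult(int total, int rejected) {}

    /**
     * Simple quality gate: count records with missing keys or out-of-range values
     * so they can be excluded before driving an AI-enabled feature.
     */
    public ValidationResult validate(List<CaseRecord> records) {
        int rejected = 0;
        for (CaseRecord r : records) {
            boolean ok = r.caseId() != null && !r.caseId().isBlank()
                    && r.category() != null
                    && r.benefitAmount() != null
                    && r.benefitAmount() >= 0;
            if (!ok) {
                rejected++;
            }
        }
        return new ValidationResult(records.size(), rejected);
    }
}
```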
Methodology, Testing & Troubleshooting:
- Agile delivery: Comfortable working in iterative sprints with product owners, QA, architects, data engineers, and business partners; able to refine requirements into deliverable increments.
- Quality mindset: Designs and writes unit, integration, and data-validation tests; supports automated regression and non-functional testing (performance, stability). See the test sketch after this list.
- Structured problem solving: Strong debugging discipline; able to analyze code, logs, and data flows to propose pragmatic solutions and identify when AI or data-driven automation adds business value.
- Risk & issue management: Anticipates delivery and production risks, including data integrity issues; raises them early and drives mitigation actions.
- Communication & teamwork: Clear written and verbal communication; able to lead or contribute to design discussions, walkthroughs, and knowledge transfer sessions across development and data teams.
- Documentation: Prepares business cases, system documentation, and user manuals for diverse audiences.
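To illustrate the quality-mindset item above, a short JUnit 5 test that exercises the hypothetical FeatureDataValidator sketched earlier in this posting; the record values and assertions are invented for the example.

```java
// Illustrative sketch only; exercises the hypothetical validator sketched earlier.
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.List;
import org.junit.jupiter.api.Test;

class FeatureDataValidatorTest {

    private final FeatureDataValidator validator = new FeatureDataValidator();

    @Test
    void rejectsRecordsWithMissingOrNegativeValues() {
        // Two bad records (blank id, negative amount) and one good record.
        List<FeatureDataValidator.CaseRecord> records = List.of(
                new FeatureDataValidator.CaseRecord(" ", "housing", 100.0),
                new FeatureDataValidator.CaseRecord("C-2", "housing", -5.0),
                new FeatureDataValidator.CaseRecord("C-3", "housing", 250.0));

        FeatureDataValidator.ValidationResult result = validator.validate(records);

        assertEquals(3, result.total());
        assertEquals(2, result.rejected());
    }
}
```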
Language Skills
- English
Notice to users
This offer comes from a TieTalent partner platform. Click "Apply now" to submit your application directly on their site.